CN115268736A - Interface switching method and electronic equipment

Interface switching method and electronic equipment

Info

Publication number
CN115268736A
CN115268736A (application number CN202110483579.1A)
Authority
CN
China
Prior art keywords
interface
trigger instruction
context data
switching
content information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110483579.1A
Other languages
Chinese (zh)
Inventor
顾佳熙
陈开济
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110483579.1A priority Critical patent/CN115268736A/en
Priority to PCT/CN2022/085396 priority patent/WO2022228066A1/en
Publication of CN115268736A publication Critical patent/CN115268736A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions


Abstract

The present application relates to the field of terminal interface display and terminal interface switching (jumping), and in particular to an interface switching method and an electronic device. The method includes: displaying a first interface; receiving a trigger instruction, where the trigger instruction is used to trigger interface switching and includes the process of a second interface and content information of the second interface; acquiring context data associated with the second interface according to the process of the second interface and the content information of the second interface contained in the trigger instruction; and switching from the displayed first interface to the second interface according to the context data associated with the second interface. Based on the technical solution provided by the present application, switching to a specific interface of an application can be realized, optimizing the user's experience.

Description

Interface switching method and electronic equipment
Technical Field
The present disclosure relates to the field of terminal display, and in particular, to an interface switching method and an electronic device.
Background
With the development of terminals, the functions of terminals, typified by mobile phones, have become increasingly powerful, and the applications they support have become increasingly abundant. While using a terminal, a user often needs to switch back and forth among multiple interfaces of multiple applications.
Currently, a user can switch interfaces by clicking a jump link; alternatively, the user may switch among the application interfaces displayed in a task manager, but through the task manager the user can only switch to the most recently opened interface of each application. Current interface switching methods are not intelligent enough, cannot meet users' diversified interface switching requirements, and provide a poor user experience.
Disclosure of Invention
In view of the above problems in the prior art, the application provides an interface switching method and an electronic device, which can provide a more intelligent interface switching function for a user and optimize the experience of the user.
In order to achieve the above object, a first aspect of the present application provides an interface switching method, including:
displaying a first interface;
receiving a trigger instruction, wherein the trigger instruction is used for triggering interface switching; the trigger instruction comprises the process of the second interface and the content information of the second interface;
acquiring context data associated with the second interface according to the process of the second interface and the content information of the second interface contained in the trigger instruction;
switching from the displayed first interface to the second interface according to the context data associated with the second interface.
Therefore, based on the interface switching method provided by the present application, the granularity of interface switching covers both the process and the content information of the target interface. Refining the switching granularity in this way effectively avoids conflicts between interface switching requests.
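The claimed flow can be sketched as follows. This is a minimal illustration; the class and method names and the `app://` links are hypothetical, not taken from the patent.

```python
# Minimal sketch of the claimed switching flow; all names are illustrative.
class InterfaceSwitcher:
    def __init__(self):
        # cache maps (process, content information) -> context data,
        # e.g. a jump link or the address of the target interface
        self.cache = {}
        self.current = "first_interface"

    def on_trigger(self, process, content):
        """Handle a trigger instruction carrying the second interface's
        process and content information; switch if context data is cached."""
        context = self.cache.get((process, content))
        if context is None:
            return False  # no cached context data for this interface
        self.current = context  # switch from the first to the second interface
        return True

switcher = InterfaceSwitcher()
switcher.cache[("shopping_app", "order_detail")] = "app://shopping/order_detail"
switcher.on_trigger("shopping_app", "order_detail")
```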
As an implementation manner of the first aspect, the second interface is an interface opened in history; the cache stores the process of the second interface, the content information of the second interface and the context data associated with the second interface.
Therefore, the second interface is an interface that has been opened before; that is, the interface information and jump information related to the second interface are already stored in the cache, so the switch to the second interface can be performed quickly.
As an implementation manner of the first aspect, the process of the second interface, the content information of the second interface, and the context data associated with the second interface are cached in the cache in the form of a queue, respectively.
Therefore, multi-level queues implement hierarchical caching of the process of the second interface, the content information of the second interface, and the context data associated with the second interface. The levels do not affect each other, which makes the caching more flexible.
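One way to realize such a multi-level queue cache is sketched below; the three-level structure and all names (`processes`, `contents`, `contexts`) are assumptions for illustration, not the patent's design.

```python
from collections import deque

# Illustrative multi-level queue cache: a queue of processes, a queue of
# content entries per process, and context data per (process, content).
class LayeredCache:
    def __init__(self):
        self.processes = deque()   # level 1: processes of opened interfaces
        self.contents = {}         # level 2: process -> deque of content info
        self.contexts = {}         # level 3: (process, content) -> context data

    def put(self, process, content, context):
        if process not in self.contents:
            self.processes.append(process)
            self.contents[process] = deque()
        if content not in self.contents[process]:
            self.contents[process].append(content)
        self.contexts[(process, content)] = context

    def get(self, process, content):
        return self.contexts.get((process, content))

cache = LayeredCache()
cache.put("gallery_app", "favorites", "app://gallery_app/favorites")
```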
As an implementation manner of the first aspect, the trigger instruction further includes a layout of the second interface. Therefore, the switching granularity also includes the interface layout, so switching can target a specific layout of the interface, further avoiding conflicts between interface switching requests.
As an implementation manner of the first aspect, the context data associated with the second interface is a link to jump to the second interface or address information of the second interface.
As an implementation manner of the first aspect, after the switching from the displayed first interface to the displayed second interface, the method further includes:
refreshing the data stored in the cache using a least recently used algorithm.
Therefore, refreshing the data stored in the cache with the least recently used algorithm keeps new entries and evicts stale ones.
As an implementation manner of the first aspect, when the second interface requested by the trigger instruction is opened for the first time, the process, the content information, and the context data matched with the interface are extracted, and the extracted process, the extracted content information, and the extracted context data matched with the interface are stored.
Therefore, when the second interface requested by the trigger instruction is opened for the first time, the information related to the second interface is stored for later use. This improves the speed and efficiency of switching to the second interface next time and improves the user experience.
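The first-open path described above amounts to cache-miss handling, which can be sketched as follows; `open_interface`, `resolve_context`, and the `app://` link format are hypothetical names for illustration.

```python
# Sketch of the "first open" path: on a cache miss, the interface's process,
# content information, and context data are extracted and stored, so the next
# trigger for the same interface can switch immediately.
store = {}

def open_interface(process, content, resolve_context):
    key = (process, content)
    if key not in store:                              # first time: extract...
        store[key] = resolve_context(process, content)  # ...and store
    return store[key]                                 # later triggers hit the cache

link = open_interface("album_app", "favorites", lambda p, c: f"app://{p}/{c}")
```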
As an implementation manner of the first aspect, when the trigger instruction is a voice trigger instruction, slot information of the trigger instruction is obtained through voice recognition; the process of the second interface, the content information of the second interface, and the context data associated with the second interface are determined according to the slot information.
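Slot extraction from a recognized utterance might look like the toy sketch below. A real system would rely on speech recognition and natural-language understanding; the utterance pattern and slot names here are assumptions for illustration only.

```python
import re

# Toy slot extraction for a voice trigger such as
# "go back to the order page in the shopping app".
PATTERN = re.compile(
    r"go back to the (?P<content>[\w ]+) page in the (?P<process>[\w ]+) app"
)

def extract_slots(utterance):
    """Return the process and content slots, or None if no match."""
    m = PATTERN.match(utterance)
    if not m:
        return None
    return {"process": m.group("process"), "content": m.group("content")}
```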
As an implementation manner of the first aspect, the second interface is a network interface opened by interacting with a server; or, the second interface is a local interface.
As an implementation manner of the first aspect, the second interface is a shopping item related interface.
As an implementation manner of the first aspect, the second interface is an album-related interface.
A second aspect of the present application provides an electronic device including:
the display module is used for displaying a first interface;
the receiving module is used for receiving a triggering instruction, and the triggering instruction is used for triggering interface switching; the trigger instruction comprises the process of the second interface and the content information of the second interface;
the obtaining module is used for obtaining context data associated with the second interface according to the process of the second interface and the content information of the second interface contained in the trigger instruction;
a switching module for switching from the displayed first interface to the second interface according to the context data associated with the second interface.
As an implementation manner of the second aspect, the second interface is an interface opened in history; the cache stores the process of the second interface, the content information of the second interface and the context data associated with the second interface.
As an implementation manner of the second aspect, the process of the second interface, the content information of the second interface, and the context data associated with the second interface are cached in the cache in the form of a queue, respectively.
As an implementation manner of the second aspect, the triggering instruction further includes: a layout of the second interface.
As an implementation manner of the second aspect, the context data associated with the second interface is a link to jump to the second interface or address information of the second interface.
As an implementation manner of the second aspect, after the switching from the displayed first interface to the second interface, the method further includes:
refreshing the data stored in the cache by using a least recently used algorithm.
As an implementation manner of the second aspect, when the second interface requested by the trigger instruction is opened for the first time, the process, the content information, and the context data matched with the interface are extracted, and the extracted process, content information, and context data are stored.
As an implementation manner of the second aspect, when the trigger instruction is a voice trigger instruction, slot information of the trigger instruction is obtained through voice recognition; the process of the second interface, the content information of the second interface, and the context data associated with the second interface are determined according to the slot information.
As an implementation manner of the second aspect, the second interface is a network interface opened by interacting with a server; or, the second interface is a local interface.
As an implementation manner of the second aspect, the second interface is a shopping item related interface.
As an implementation manner of the second aspect, the second interface is an album-related interface.
A third aspect of the present application provides a terminal, including a memory and a processor, where the memory stores executable codes, and the processor executes the executable codes to implement the interface switching method according to any one of the above first aspect.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon program instructions that, when executed by a computer, cause the computer to execute the interface switching method of any one of the above first aspects.
A fifth aspect of the present application provides a computer program product, which, when run on a computing device, causes the computing device to perform the interface switching method of any one of the above first aspects.
These and other aspects of the present application will be more readily apparent from the following description of the embodiment(s).
Drawings
The various features of the present application and the connections between them are further described below with reference to the drawings. The figures are exemplary; some features are not shown to scale, some figures may omit features that are customary in the art to which this application relates and are not essential to the application, and some may show additional features that are not essential to the application. The combination of features shown in the figures is not intended to limit the application. In addition, the same reference numerals are used throughout the specification to designate the same components. The specific drawings are described as follows:
fig. 1 is a schematic structural diagram of a terminal applicable to the interface switching method provided in the embodiment of the present application;
fig. 2 is a flowchart of an interface switching method according to an embodiment of the present disclosure;
fig. 3 is a diagram illustrating an example of a data structure of an interface abstract and context data corresponding to the interface abstract according to an embodiment of the present disclosure;
fig. 4 is a diagram illustrating another example of an interface abstract and a data structure of context data corresponding to the interface abstract according to an embodiment of the present disclosure;
fig. 5 is a flowchart of another implementation manner of an interface switching method according to an embodiment of the present application;
fig. 6 is another flowchart of an interface switching method according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The terms "first", "second", "third", and the like, or "module A", "module B", "module C", and the like in the description and claims are used to distinguish between similar elements and do not necessarily describe a particular sequential or chronological order. It should be understood that specific orders or sequences may be interchanged, where permissible, so that the embodiments of the present application can be implemented in an order other than that illustrated or described herein.
In the following description, reference numerals indicating steps, such as S110, S120, etc., do not necessarily mean that the steps are performed in that order; where permitted, the order of steps may be interchanged, or steps may be performed simultaneously. Where permitted, additional or fewer steps may be included; the description is not particularly limited in this respect.
The term "comprising" as used in the specification and claims should not be construed as being limited to the contents listed thereafter; it does not exclude other elements or steps. It is thus to be interpreted as specifying the presence of the stated features, integers, steps or components as referred to, but does not preclude the presence or addition of one or more other features, integers, steps or components, or groups thereof. Thus, the expression "an apparatus comprising the devices a and B" should not be limited to an apparatus consisting of only the components a and B.
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the application. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, although they may be. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments, as would be apparent to one of ordinary skill in the art from this disclosure.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. In the case of inconsistency, the meaning described in the present specification or the meaning derived from the content described in the present specification shall control. In addition, the terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the present application.
Before the embodiments of the present application are described in further detail, the terms and expressions mentioned in the embodiments, together with their uses and functions, are explained. The terms and expressions mentioned in the embodiments of the present application have the following meanings:
1. Interface abstract technology: a technique for constructing an abstract (summary) of an interface by extracting the most representative information in the interface. The direct purpose of the interface abstract is to identify the interface, so that interfaces can be recognized, differentiated, and retrieved through this identifier. A coarse-grained interface abstract may use the process corresponding to the interface as the abstract; a fine-grained interface abstract may be obtained by analyzing the content displayed on the interface. The granularity of the interface abstract is therefore related to its ability to resolve request conflicts: a fine-grained interface abstract can avoid conflicts between service requests.
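The difference between coarse-grained and fine-grained interface abstracts can be illustrated with two hypothetical record types; the field names are assumptions, not the patent's data structures.

```python
from dataclasses import dataclass

# A coarse abstract identifies only the process, so two pages of the same
# application collide; a fine abstract adds content information (and,
# optionally, layout), which distinguishes them.
@dataclass(frozen=True)
class CoarseAbstract:
    process: str

@dataclass(frozen=True)
class FineAbstract:
    process: str
    content: str
    layout: str = ""

order_list = FineAbstract("shopping_app", "order_list")
order_detail = FineAbstract("shopping_app", "order_detail")
```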
2. Least Recently Used (LRU) algorithm: a page replacement algorithm that evicts the page that has not been used for the longest time, that is, the unused page whose last use is furthest from the current moment.
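A minimal LRU cache implementing this eviction rule is sketched below; it is a generic illustration, not code from the patent.

```python
from collections import OrderedDict

# Minimal LRU cache: when the cache is full, the entry unused for the
# longest time is discarded.
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # "a" becomes most recently used
cache.put("c", 3)  # capacity exceeded: "b" is evicted
```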
The embodiment of the application provides an interface switching method, which is used for realizing interface switching by a user through caching information of an opened interface and optimizing experience of user interface switching. Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
First, a scene to which the interface switching method provided in the embodiment of the present application is applied is introduced.
The interface switching method provided by the embodiment of the application can be applied to any terminal with a display screen and a basic interaction function, such as a mobile phone, a tablet computer, a wearable device with a wireless communication function (e.g., a smart watch, a bracelet, a smart helmet, etc.), an in-vehicle device, smart furniture, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), etc. In the interface switching method provided by the embodiment of the application, a user can initiate an interface trigger instruction in a natural language or touch key manner and the like, and respond to the trigger instruction on a terminal so as to realize the rapid switching of a specific interface. By adopting the method, the switching of the specific interface can be realized, the intelligence of interface switching is improved, and the user experience is improved.
The following description will take an example of applying the method provided by the embodiment of the present application to a mobile phone with a display screen and a basic interactive function.
Fig. 1 is a schematic structural diagram of a terminal 100 to which an embodiment of the present application is applicable.
As shown in fig. 1, the terminal 100 may include a processor 110, a memory 120, a charging management module 130, a power management module 131, a battery 132, an antenna 1, an antenna 2, a mobile communication module 140, a wireless communication module 150, an audio module 160, a speaker 160A, a receiver 160B, a microphone 160C, keys 170, a motor 171, an indicator, a display screen 172, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal 100. In other embodiments of the present application, terminal 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 is configured to execute the interface switching method provided in the embodiment of the present application. Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
As an implementation, the processor 110 may be a neural-network (NN) computing processor, which processes input information quickly by drawing on the structure of biological neural networks, for example, the transfer mode between neurons of a human brain, and can also learn continuously by itself. It can realize intelligent applications on the terminal 100, such as speech command recognition and text command understanding.
The memory 120 may be configured to store computer-executable program code, including the code related to the interface switching method provided in the embodiment of the present application; the stored data includes the abstracts and context data of opened interfaces. In some embodiments, the memory 120 may be a cache memory. The memory 120 may store instructions or data that the processor 110 uses frequently; when the processor 110 needs such instructions or data, it can fetch them directly from the memory 120, avoiding repeated accesses, reducing the latency of the processor 110, and thereby increasing the efficiency of the system. In some embodiments, the memory 120 may include a program storage area and a data storage area. The program storage area may store an operating system and application programs required by at least one function (such as a sound playing function or an image playing function). The data storage area may store data created during the use of the terminal 100, and the like. In addition, the memory 120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs the various functional methods or data processing of the terminal 100 by executing instructions stored in the memory 120 and/or instructions stored in a memory provided in the processor.
The charging management module 130 is used to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 130 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 130 may receive a wireless charging input through a wireless charging coil of the terminal 100.
The charging management module 130 may also supply power to the electronic device through the power management module 131 while charging the battery 132.
The power management module 131 is used to connect the battery 132, the charging management module 130 and the processor 110. The power management module 131 receives an input from the battery 132 and/or the charging management module 130, and supplies power to the processor 110, the internal memory 121, the display 172, the wireless communication module 150, and the like. The power management module 131 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 131 may be disposed in the processor 110. In other embodiments, the power management module 131 and the charging management module 130 may be disposed in the same device.
The wireless communication function of the terminal 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 140, the wireless communication module 150, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 140 may provide a solution including 2G/3G/4G/5G wireless communication and the like applied on the terminal 100. The mobile communication module 140 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 140 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the filtered electromagnetic wave to the modem processor for demodulation. The mobile communication module 140 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 140 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 140 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 150 may provide solutions for wireless communication applied to the terminal 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi)) networks, Bluetooth (BT), Bluetooth low energy (BLE), ultra wide band (UWB), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 150 may be one or more devices integrating at least one communication processing module. The wireless communication module 150 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 150 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves radiated through the antenna 2.
In some embodiments, the antenna 1 of the terminal 100 is coupled to the mobile communication module 140 and the antenna 2 is coupled to the wireless communication module 150, so that the terminal 100 can communicate with networks and other electronic devices through wireless communication technology.
The terminal 100 may implement display functionality via the GPU, display screen 172, and application processor, among other things. The GPU is a microprocessor for image processing, and is connected to the display screen 172 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 172 is used to display the first interface, the second interface, and the like. The display screen 172 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal 100 may include one or more display screens 172.
The terminal 100 may implement a camera function through the camera module 193, the ISP, the video codec, the GPU, the display screen 172, the application processor (AP), the neural-network processing unit (NPU), and the like.
The terminal 100 may implement audio functions, such as music playing and recording, through the audio module 160, the speaker 160A, the receiver 160B, the microphone 160C, the earphone interface 170D, and the application processor.
The audio module 160 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 160 may also be used to encode and decode audio signals. In some embodiments, the audio module 160 may be disposed in the processor 110, or some functional modules of the audio module 160 may be disposed in the processor 110.
The speaker 160A, also called a "horn", is used to convert electrical audio signals into sound signals. The terminal 100 can listen to music through the speaker 160A or output an audio signal for a handsfree phone call.
The receiver 160B, also called an "earpiece", is used to convert a received voice trigger instruction (an electrical signal) into an acoustic signal.
The microphone 160C, also called a "mic", is used to convert an acoustic signal into an electrical signal, for example, to convert a voice instruction into an electrical signal. When making a call or sending voice information, the user may input a sound signal by speaking close to the microphone 160C. The terminal 100 may be provided with at least one microphone 160C. In other embodiments, the terminal 100 may be provided with two microphones 160C to implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the terminal 100 may further include three, four, or more microphones 160C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and so on.
The keys 170 may include a power key, a volume key, and the like. The keys 170 may be mechanical keys or touch keys. The terminal 100 may receive key inputs and generate key signal inputs related to user settings and function control of the terminal 100.
The motor 171 may generate a vibration prompt. The motor 171 may be used for touch vibration feedback. For example, touch operations applied to different applications (for example, photographing and audio playing) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 172 may also correspond to different vibration feedback effects. Different application scenarios (for example, time reminding, receiving information, alarm clock, and game) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 172 may be an indicator light, and may be used to indicate a charging state or a change in battery level, or to indicate a message, a notification, and the like.
Exemplarily, fig. 2 shows an interface switching method provided in an embodiment of the present application. The method includes steps S110-S150, which are described in turn as follows:
s110: the terminal receives an interface trigger instruction initiated by a user and analyzes the trigger instruction.
As an alternative implementation manner, a user may actively initiate a trigger instruction for a terminal interface through an interface interaction portal of the terminal, where the trigger instruction includes, but is not limited to: a trigger instruction initiated by voice (such as a voice trigger instruction of "open a memo", "open a hot search of certain social software", and the like), a trigger instruction initiated by a touch key (such as a user clicking an album icon), and the like.
Specifically, if the parsed trigger instruction is an instruction for opening an interface (for example, a new interface is opened by clicking a touch button of the interface, or the intention of a parsed voice instruction is to open an interface), step S120 is performed; if the parsed trigger instruction is an instruction for initiating an interface jump (for example, jumping back to a previously browsed interface), step S140 is performed. The interface jump instruction may be triggered by a quick-jump control of the interface, or when the intention of a voice instruction is recognized as a quick interface jump.
As an optional implementation, the instruction for opening an interface means that the interface requested by the instruction is opened in the terminal for the first time, or that no abstract of the requested interface and its corresponding context data are stored in the buffer of the terminal. The interface jump instruction means that the interface requested by the instruction is not opened in the terminal for the first time, that is, the abstract of the requested interface and its corresponding context data are already stored in the buffer of the terminal.
As another optional implementation, if the trigger instruction is triggered by voice, whether it is an instruction to open an interface or an instruction to initiate an interface jump may be determined by performing voice recognition on the instruction. For example, when the recognized instruction includes words such as "open" or "launch", it is judged to be an instruction for opening an interface; when the recognized instruction includes words such as "jump", "turn to", "switch", or "return", it is judged to be an instruction for initiating an interface jump. As another example, the instruction may also be classified as an instruction to open an interface or an instruction to initiate an interface jump by a natural language processing model, for example a deep-learning-based model; this embodiment does not specifically limit the natural language processing model.
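The keyword-based judgment described above can be sketched as follows. This is an illustrative sketch only, not the implementation of this application; the keyword lists and function names are assumptions, and a real system could instead use a deep-learning-based natural language processing model.

```python
# Illustrative sketch (assumption, not this application's implementation):
# classify a recognized voice command as an "open interface" instruction or
# an "interface jump" instruction by keyword matching, as described above.

OPEN_WORDS = ("open", "launch", "start")
JUMP_WORDS = ("jump", "turn to", "switch", "return", "go back")

def classify_instruction(text: str) -> str:
    """Return 'jump', 'open', or 'unknown' for a recognized voice command."""
    lowered = text.lower()
    # naive substring matching; a production system would match whole words
    if any(w in lowered for w in JUMP_WORDS):
        return "jump"    # proceed to S140: look up the cached interface abstract
    if any(w in lowered for w in OPEN_WORDS):
        return "open"    # proceed to S120: display the interface, extract abstract
    return "unknown"
```

The jump keywords are checked first so that a mixed command defaults to the jump path, matching the priority given to returning to a cached interface.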
S120: and the terminal displays the interface requested by the trigger instruction according to the trigger instruction and extracts the interface abstract of the interface.
As an optional implementation manner, after receiving the trigger instruction, the terminal may initiate a request for invoking an interface requested in the trigger instruction to the server, and when receiving the requested interface sent by the server, display the interface. When or after the interface is displayed, the terminal can extract the interface abstract of the interface by using an interface abstract technology. For example: the received trigger instruction may be: open a popular video for a short video application, open an order for a take-away software, etc.
As another optional implementation manner, after receiving the trigger instruction, the terminal may also directly obtain the abstract and the related context data of the interface corresponding to the trigger instruction from the terminal locally, and display the interface. When or after the interface is displayed, the terminal can extract the interface abstract of the interface by using an interface abstract technology. For example: the received trigger instruction may be: opening a memo, opening a notepad, opening an album, and the like.
Optionally, the trigger instruction is a user voice instruction, and the requested interface is displayed according to semantics carried by the trigger instruction.
S130: the terminal stores the interface abstract extracted in step S120 and the context data corresponding to the interface abstract in a buffer.
The context data corresponding to the interface abstract is used to implement a jump to the interface according to the interface abstract. The context data corresponding to the interface may be a link of the interface, such as a URL link, or an interface address. The context data corresponding to the interface may also take other forms; this is not limited in this embodiment of the application.
Optionally, the terminal may also refresh the cached interface abstracts, for example, by using a least recently used (LRU) algorithm.
Specifically, refreshing the interface abstracts cached in the buffer by using the LRU algorithm includes: when a new interface abstract is inserted into the buffer of the terminal, that is, a new node is inserted into the nth layer of the interface abstract, and the number of nodes in that layer exceeds the constraint length of the layer, the least recently used node is deleted according to the LRU policy before the new node is inserted. That is, the node at the head of the layer queue (the least recently used node) is deleted, and the new node is inserted at the tail of the layer queue. Here n ∈ {1, 2, 3}, representing process, layout, and content respectively.
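The per-layer LRU insertion described above can be sketched as follows, under the stated queue semantics: the head of the queue is the least recently used node and the tail the most recently used. The class and variable names are assumptions, not part of this application.

```python
from collections import OrderedDict

# Illustrative sketch: each layer n (process / layout / content) keeps its
# own queue with its own constraint length max_len; inserting past the
# constraint length evicts the head (least recently used) node, as above.

class LayerCache:
    def __init__(self, max_len: int):
        self.max_len = max_len          # constraint length of this layer
        self.queue = OrderedDict()      # key -> context data, in LRU order

    def insert(self, key, context):
        """Insert a node at the tail; evict the head node if over capacity."""
        if key in self.queue:
            self.queue.move_to_end(key)        # already cached: move to tail
            self.queue[key] = context
            return
        if len(self.queue) >= self.max_len:
            self.queue.popitem(last=False)     # delete the least recently used head
        self.queue[key] = context              # new node at the tail
```

For the two-layer abstract of fig. 3, two such per-layer caches would be kept; for the three-layer abstract of fig. 4, three, each with its own independent constraint length.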
S140: and the terminal acquires an interface abstract matched with the interface requested by the trigger instruction and context data corresponding to the interface abstract from the buffer according to the trigger instruction.
As an optional implementation, the data structure of the interface abstract and the context data corresponding to the interface abstract is shown in fig. 3, and includes an interface abstract portion 410 and context data 420 for implementing a jump to the corresponding interface according to the interface abstract.
In particular, interface abstract portion 410 includes process and content. In the process layer, max_len1 may be used to constrain the length of the layer; when the number of process nodes in the layer exceeds max_len1, the process nodes are refreshed by using the LRU algorithm. In the content layer, max_len2 may be used to constrain the length of the layer; when the number of content nodes in the layer exceeds max_len2, the content nodes are refreshed by using the LRU algorithm.
Specifically, the context data 420 is an interface context (PageContext) shown in fig. 3. For example, in the Android system, the Context data is Context (Context) class data.
As another alternative implementation manner, the data structure of the interface abstract and the context data corresponding to the interface abstract is shown in fig. 4, and includes an interface abstract portion 510 and context data 520 for implementing a jump to a corresponding interface according to the interface abstract.
Optionally, interface abstract portion 510 may include process, layout, and content. In the process layer, max_len1 may be used to constrain the length of the layer; when the number of process nodes exceeds max_len1, the process nodes are refreshed by the LRU algorithm. In the layout layer, max_len2 may be used to constrain the length of the layer; when the number of layout nodes exceeds max_len2, the layout nodes are refreshed using the LRU algorithm. In the content layer, max_len3 may be used to constrain the length of the layer; when the number of content nodes exceeds max_len3, the content nodes are refreshed by the LRU algorithm.
Specifically, refreshing each layer of nodes by using the LRU algorithm in this step includes: performing a matching query in the buffer of the terminal; when the nth layer in the interface abstract is successfully retrieved, that is, the matching succeeds, the nth layer updates the node position according to the LRU policy, moving the matched node to the tail of the layer queue to mark it as the most recently accessed node. Here n ∈ {1, 2, 3}, representing the process of the interface, the layout of the interface, and the content of the interface respectively.
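The cache-hit update described above can be sketched as follows; on a successful retrieval, the matched node is moved to the tail of its layer queue. Function and variable names are assumptions.

```python
from collections import OrderedDict

# Illustrative sketch of the hit-side LRU refresh described above: a layer
# is a queue whose tail holds the most recently accessed node.

def lookup(layer: OrderedDict, key):
    """Return the cached value on a hit and refresh its LRU position."""
    if key not in layer:
        return None                # miss: step S140 yields nothing
    layer.move_to_end(key)         # hit: move the node to the tail of the queue
    return layer[key]
```

Together with the eviction on insertion, this keeps each layer queue ordered from least to most recently used, independently of the other layers.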
In this embodiment, the context data 520 is the interface context (PageContext) shown in fig. 4. In this embodiment of the application, the process in the interface abstract is an application name or a software ID/package name, for example: application B. The layout in the interface abstract is a presentation mode preset by the interface, for example: the layout of the commodity detail page in shopping software A is a continuous equal-width picture layout; as another example: the layout of the chat window of social software C places the other party's avatar and message box on the left, and the user's own avatar and message box on the right. The content in the interface abstract is specific information such as text and pictures, for example: the current temperature of a certain area shown in certain weather software.
In this embodiment, a specific implementation of step S140 is described by taking a voice instruction as an example of the trigger instruction. First, the received voice instruction is parsed to obtain its intention and slots. For example, the intention is a quick interface jump (other intention names, such as return interface, jump interface, or switch interface, may also be used). The interface abstract matching the interface requested by the trigger instruction, and the context data corresponding to the interface abstract, are then acquired from the buffer according to the slots of the voice instruction. For example, the trigger instruction is "go back to the detail page of item a with item number XXX in shopping application A". The intention of the voice instruction is first obtained as a quick interface jump. Then the slots of the voice instruction are extracted: "shopping application A, detail page, item a with item number XXX". Based on the data structure of the interface abstract and its corresponding context data shown in fig. 4, it is determined that "shopping application A" in the instruction corresponds to the process in the data structure, so process = shopping application A; that "detail page" corresponds to the layout, so layout = detail page; and that "item a with item number XXX" corresponds to the content, so content = item a with item number XXX. According to process = shopping application A, layout = detail page, and content = item a with item number XXX, the matching interface abstract is acquired from the buffer, together with the context data corresponding to the interface abstract.
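The slot-to-layer mapping in the example above can be sketched as follows. The slot parser itself (intention and slot recognition) is assumed to exist elsewhere; the slot names "application", "page_type", and "item" are assumptions for illustration.

```python
# Illustrative sketch: map the parsed slots of a voice instruction onto the
# three-layer lookup key (process, layout, content) of fig. 4, which is then
# used to query the buffer in step S140.

def slots_to_key(slots: dict) -> tuple:
    """Build the (process, layout, content) key used to query the buffer."""
    return (
        slots.get("application"),   # e.g. "shopping application A" -> process
        slots.get("page_type"),     # e.g. "detail page"            -> layout
        slots.get("item"),          # e.g. "item a, item number XXX" -> content
    )
```

A missing slot simply yields `None` for that layer, which a matcher could treat as a wildcard over that layer's queue.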
S150: and the terminal jumps to a corresponding interface and displays the interface according to the interface abstract and the context data corresponding to the interface abstract acquired in the step S140.
In addition, if the interface summary and/or the corresponding context data of the interface requested by the trigger instruction are not obtained in step S140, the trigger instruction is not responded.
As an optional implementation, the process of extracting the interface abstract of an interface in step S120 by using the interface abstract technology is described in detail below, where the interface abstract includes process and content.
First, the process to which the interface belongs is identified and stored in string form, for example: F_process = Str(process name). The process may be an application name or a software ID/package name; for example, the process to which the interface belongs may be album, music, shopping application, reading, and the like.
Then, the content displayed by the interface is identified, including the text content and the image content of the interface, and the content displayed by the interface is stored in string form, for example: F_content = Summarize_text(TextView) + Summarize_image(ImageView). Here F_content is the content displayed by the interface; Summarize is a summary function used to extract a summary; Summarize_text is the function for extracting a summary of the interface text content; Summarize_image is the function for extracting a summary of the interface image content; TextView is the text content of the interface; and ImageView is the image content of the interface.
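The content formula can be sketched as follows. The stand-in summarizers are assumptions: a real Summarize_text would be a text-summarization model and a real Summarize_image an image-captioning or labeling model; here trivial string operations take their place purely for illustration.

```python
# Illustrative sketch of F_content = Summarize_text(TextView) +
# Summarize_image(ImageView), with stand-in summarizers (assumptions).

def summarize_text(text_view: str) -> str:
    # stand-in: keep the first five words as the text "summary"
    return " ".join(text_view.split()[:5])

def summarize_image(image_labels: list) -> str:
    # stand-in: join labels already attached to the interface images
    return ",".join(image_labels)

def f_content(text_view: str, image_labels: list) -> str:
    """F_content: concatenation of the text and image summaries."""
    return summarize_text(text_view) + "+" + summarize_image(image_labels)
```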
And finally, forming an interface abstract according to the process to which the interface belongs and the hierarchy of the interface content.
As another optional implementation manner, the interface abstract may further include a layout, that is, in this embodiment, the interface abstract includes a process, a layout, and content.
First, the process of identifying the process to which the interface belongs is basically the same as the process of identifying the process to which the interface belongs in the previous embodiment, and therefore, the description thereof is omitted here.
Second, the layout of the interface is identified. In this step, the Extensible Markup Language (XML) file corresponding to the interface layout is first obtained; the maximum depth of the interface layout, the number of image nodes, and the number of text nodes are obtained by parsing the XML and are stored in string form, for example: F_layout = (Depth(XML), Counter(XML, ImageView), Counter(XML, TextView)). Here F_layout is the layout of the interface; Depth(XML) is the node depth obtained by XML parsing; Counter(XML, ImageView) is the number of image nodes obtained by XML parsing; and Counter(XML, TextView) is the number of text nodes obtained by XML parsing.
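The layout fingerprint can be sketched as follows. The tag names follow Android view classes; the helper names and the XML snippet used for checking are assumptions.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of F_layout = (Depth(XML), Counter(XML, ImageView),
# Counter(XML, TextView)), computed by parsing the layout XML as above.

def f_layout(xml_str: str) -> tuple:
    root = ET.fromstring(xml_str)

    def depth(node) -> int:
        """Maximum nesting depth of the layout tree."""
        children = list(node)
        return 1 + (max(depth(c) for c in children) if children else 0)

    def count(tag: str) -> int:
        """Number of nodes in the tree with the given tag."""
        return sum(1 for n in root.iter() if n.tag == tag)

    return (depth(root), count("ImageView"), count("TextView"))
```

For example, a LinearLayout containing an ImageView and a nested LinearLayout with two TextViews has depth 3, one image node, and two text nodes.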
Then, the content displayed by the interface is identified, and the process is basically the same as the content displayed by the interface in the previous embodiment, so the details are not repeated here.
And finally, forming an interface abstract according to the process to which the interface belongs, the interface layout and the content of the interface.
It should be understood that the steps and technical features in the present embodiment are only examples, and do not limit the embodiments of the present invention. Those skilled in the art may implement the interface switching scheme of the present invention using more or fewer steps/techniques.
It can be understood that, in this embodiment, the interface abstract related in S120 to S150 may also be replaced with other contents, for example, a title of the interface, a keyword of the interface, a hotword of the interface, picture information of the interface, time information of the interface, partial or all text information of the entire interface, and the like, so that the electronic device may match with other information of the interface associated in the voice instruction to obtain context data of the target interface, and thus switch to an interface corresponding to the context data. The interface switching (skipping) can be realized by caching the interface abstract of the interface as an index, and the interface switching can also be realized by other contents of the interface. The embodiment of the present application does not limit this.
Based on the above embodiments, the embodiment of the present invention provides another embodiment as shown in fig. 5. This embodiment is described by taking an example of switching between interfaces of a mobile phone by a user.
S210: the user is browsing the mobile phone, and after viewing the multiple interfaces of the video application P, he views the multiple interfaces of the memo.
While the user views the multiple interfaces of the video application P and of the memo, the mobile phone may cache the interface information of the interfaces viewed by the user; for example, when the user opens a corresponding interface, the phone abstracts the interface and caches the abstract/keywords and the interface address. The manner of caching the interface information may be as described in the embodiment shown in fig. 2, and is not repeated here.
For example, the user browses the list of travel items recorded on March 1, 2021 in the memo notes of the mobile phone and the content recorded on April 14, 2021. The first interface currently displayed by the mobile phone is the content recorded in the memo note on April 14, 2021.
And the mobile phone abstracts the interface and caches the abstract and the context data of the interface.
S220: when an interface corresponding to 14 days in 2021, 4 months and 14 days in the memo note is viewed, the mobile phone receives a first voice trigger instruction. For example, the voice trigger instruction is "open article B in application B". And analyzing the trigger instruction to obtain that the trigger instruction is an instruction for opening an interface. The process of analyzing the trigger instruction is the same as that of step S110 in the above embodiment, and thus is not described herein again.
S230: the mobile phone switches the first interface in the step S210 to an interface corresponding to the commodity B in the application B according to the first voice trigger instruction, and extracts an interface abstract of the interface by using an interface abstract technology.
In one example, the interface abstract of the interface is: process = application B, content = commodity B. The interface abstract (process = application B, content = commodity B) and the context data corresponding to the interface abstract are stored in the buffer of the terminal. The interface abstracts and corresponding context data cached in the buffer are refreshed by using the LRU algorithm: the interface abstract of the currently requested interface and its corresponding context data are added to the tail of the queue, and when the queue exceeds its constraint length, the interface abstract and context data at the head of the queue are deleted.
S240: and the mobile phone receives a second voice trigger instruction. For example, "go back to movie P of the just browsed video application P".
S250: and analyzing the second voice trigger instruction.
When the mobile phone receives the voice trigger instruction, the trigger instruction is analyzed, the trigger instruction is obtained and is used as an instruction for initiating interface jump, the slot position/parameter of the trigger instruction is obtained and is 'video application P and movie P', and the target process in the trigger instruction is further analyzed and is 'video application P', and the target content is 'movie P'.
S260: and the mobile phone acquires an interface abstract matched with the target process and the target content of the trigger instruction from the buffer, and acquires context data corresponding to the interface abstract. That is, the mobile phone determines the interface abstract matched with the "video application P" and the "movie P", and obtains context data corresponding to the abstract, where the context data is a resource link of the movie P in the video application P.
S270: and switching to a corresponding interface and displaying the interface according to the interface abstract obtained in the step S260 and the context data corresponding to the interface abstract.
Specifically, the mobile phone may obtain content of the movie P according to the resource link of the movie P in the video application P and open a page displaying the movie P in the video application P.
Optionally, when the display succeeds, the interface abstract and corresponding context data cached in the buffer are refreshed by using the LRU algorithm; that is, the abstract of the currently requested interface and its corresponding context data are moved to the tail of their layer queue.
S280: the mobile phone receives a third voice trigger instruction and switches to the interface requested by the third voice trigger instruction. For example, "go back to the memo page recorded on March 1, 2021" or "go back to the list of travel items in the memo".
Following a process similar to S240-S270, the mobile phone switches from the currently displayed interface of the movie P in the video application P to the page of the list of travel items recorded on March 1, 2021 in the memo application.
It should be understood that the steps and technical features in the present embodiment are only examples, and do not limit the embodiments of the present invention. A person skilled in the art may implement the interface switching scheme of the present invention by using more or fewer steps/technical means.
As described in the foregoing embodiment, the interface summary cached in the embodiment of the present invention may also be replaced with other contents, for example, a title of the interface, a keyword of the interface, a hotword of the interface, image information of the interface, time information of the interface, partial or all text information of the entire interface, and the like, so that the electronic device may match with other information of the interface associated in the voice instruction to obtain context data of the target interface, so as to switch to the interface corresponding to the context data. Namely, the embodiment of the application can not only realize interface switching (skipping) by caching the interface abstract of the interface as an index, but also realize interface switching by other contents of the interface. The embodiment of the present application does not limit this.
The interface switching method provided in this embodiment of the application realizes unified modeling of the interface process, interface layout, and interface content, provides fine-grained interface identification, and can effectively avoid request conflicts. In addition, the method uses multiple levels of queues to implement hierarchical caching of interface abstracts: each level has an independent queue length, and the nodes and node positions of each queue can be updated separately through the LRU algorithm, so that the caching is more flexible, and the updating of new and old nodes is independent between levels without mutual influence.
Based on the above embodiments, fig. 6 shows a flowchart of an interface switching method according to another embodiment of the present application. The related contents and descriptions of the above embodiments are all applicable to this embodiment and are not repeated here. The method mainly includes steps S310-S340, which are introduced in turn as follows:
s310: the first interface is displayed.
The first interface is an interface currently displayed on a terminal screen. Optionally, the first interface may be any interface of the terminal, for example: a certain interface of a certain application or a desktop of a terminal, etc.
S320: receiving a trigger instruction, wherein the trigger instruction is used for triggering interface switching; the trigger instruction comprises the progress of the second interface and the content information of the second interface. Wherein the second interface may be an interface opened in history.
As an optional implementation, the process of the second interface, the content information of the second interface, and the context data associated with the second interface are stored in the buffer of the terminal in the form of queues. Specifically, the data structure in which the process of the second interface, the content information of the second interface, and the context data associated with the second interface are cached is as shown in fig. 3 of the foregoing embodiment, and is not repeated here.
As another optional implementation manner, the buffer of the terminal stores the layout of the second interface in the form of a queue. Namely: the terminal cache stores the process of the second interface, the layout of the second interface, the content information of the second interface and the context data associated with the second interface in a queue mode. Specifically, the data structure form of the specific cache is as shown in fig. 4 in the foregoing embodiment, and details thereof are not repeated here.
S330: and acquiring context data associated with the second interface according to the process of the second interface and the content information of the second interface contained in the trigger instruction. The context data associated with the second interface is a link for jumping to the second interface or address information of the second interface.
Specifically, the specific implementation manner of this step may refer to step S140 in the foregoing embodiment, and details thereof are not described here.
S340: switching from the first interface to the second interface displayed according to the context data associated with the second interface.
Specifically, the specific implementation manner of this step may refer to step S150 in the foregoing embodiment, so that the terminal switches from displaying the first interface to displaying the second interface, which is not described in detail in this embodiment.
In this embodiment of the application, if the related information of the second interface requested by the trigger instruction does not exist in the buffer of the terminal, the second interface is requested from the server. When the second interface requested by the trigger instruction is opened for the first time, the process, content information, and context data matching the interface are extracted and stored. In this step, the process of extracting the matching process, content information, and context data is basically the same as extracting the interface abstract of the interface in step S120 of the above embodiment; storing the extracted process, content information, and context data is basically the same as storing the extracted interface abstract and its corresponding context data in the buffer in step S130 of the above embodiment, and is therefore not repeated here.
Another embodiment of the present application further provides an interface switching apparatus, which may be implemented by a software system, a hardware device, or a combination of a software system and a hardware device.
It should be understood that fig. 7 is a schematic structural diagram illustrating an electronic device by way of example only, and the present application does not limit the division of the functional modules in the interface switching device. As shown in fig. 7, the electronic device may be logically divided into a plurality of modules, each of which may have different functions, the functions of each module being implemented by instructions in a memory that may be read and executed by a processor in the computing device. Illustratively, the interface switching device includes a display module 610, a receiving module 620, an obtaining module 630 and a switching module 640.
In one embodiment, the interface switching device is configured to perform the operations described in steps S310-S340 shown in fig. 6. Specifically, the method can be as follows: the display module 610 is configured to display a first interface. A receiving module 620, configured to receive a trigger instruction, where the trigger instruction is used to trigger interface switching; the trigger instruction comprises the process of the second interface and the content information of the second interface. The obtaining module 630 is configured to obtain context data associated with the second interface according to the process of the second interface and the content information of the second interface included in the trigger instruction. A switching module 640, configured to switch from the displayed first interface to the second interface according to the context data associated with the second interface. It should be noted that, in the embodiments of the present application, only the structural and functional modules of the interface switching device are exemplarily divided, but no limitation is imposed on the specific division thereof.
For example, the display module 610 in this embodiment may be the display screen 173 in the terminal 100 shown in fig. 1, and the first interface or the second interface in this embodiment is displayed through the display screen 173. The receiving module 620 in this embodiment may be the audio module 160 in the terminal 100 shown in fig. 1, which receives the trigger instruction, converts the trigger instruction (an analog audio signal) into digital audio information, and transmits the digital audio information to the obtaining module 630. The obtaining module 630 and the switching module 640 in this embodiment may be the processor 110 in the terminal 100 shown in fig. 1, which implements the interface switching. In addition, the program instructions in the present embodiment may each be stored in the memory 120 in the terminal 100 shown in fig. 1. It is to be understood that the present application is not limited to this division of the modules in the interface switching device, which may also be implemented in other manners.
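As a rough illustration only (the class, method, and field names below are hypothetical and not taken from the patent), the cooperation of the four modules could be sketched as follows:

```python
# Hypothetical sketch of the display / receive / obtain / switch pipeline.
# All names are illustrative; the patent does not prescribe an implementation.

class InterfaceSwitcher:
    def __init__(self, context_store):
        # context_store maps (process, content info) -> context data,
        # e.g. a jump link or address of the second interface.
        self.context_store = context_store
        self.current_interface = None

    def display(self, interface):                 # display module 610
        self.current_interface = interface

    def receive_trigger(self, trigger):           # receiving module 620
        # The trigger carries the second interface's process and content info.
        return trigger["process"], trigger["content"]

    def obtain_context(self, process, content):   # obtaining module 630
        return self.context_store.get((process, content))

    def switch(self, trigger):                    # switching module 640
        process, content = self.receive_trigger(trigger)
        context = self.obtain_context(process, content)
        if context is not None:
            # Sketch simplification: displaying the jump link stands in
            # for actually rendering the second interface.
            self.display(context)
        return self.current_interface
```

In this sketch the interface is left unchanged when no matching context data is cached, which is one plausible fallback; the patent itself does not specify the failure behavior.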
Optionally, the second interface is a historically opened interface, and the cache stores the process of the second interface, the content information of the second interface, and the context data associated with the second interface.
In one implementation, the process of the second interface, the content information of the second interface, and the context data associated with the second interface are cached in the cache in the form of a queue, respectively.
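The queue-style caching of the three fields could be pictured as parallel bounded FIFO queues (a sketch under assumed data shapes; the capacity and the single-function API below are hypothetical):

```python
from collections import deque

# Illustrative parallel queues for the three cached fields; the capacity
# is arbitrary. Oldest entries drop off automatically when full.
CAPACITY = 8
processes = deque(maxlen=CAPACITY)
contents = deque(maxlen=CAPACITY)
contexts = deque(maxlen=CAPACITY)

def cache_interface(process, content, context):
    """Append one historically opened interface to all three queues."""
    processes.append(process)
    contents.append(content)
    contexts.append(context)

def lookup(process, content):
    """Return the context data of the matching entry, newest first, else None."""
    for i in range(len(processes) - 1, -1, -1):
        if processes[i] == process and contents[i] == content:
            return contexts[i]
    return None
```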
In one implementation, the triggering instruction further includes: a layout of the second interface.
Optionally, the context data associated with the second interface is a link to jump to the second interface or address information of the second interface.
Optionally, after the switching from the displayed first interface to the second interface, the method further includes:
refreshing the data stored in the cache using a least recently used algorithm.
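The patent names a least-recently-used policy but not a data structure; one minimal sketch of such a refresh, using Python's `OrderedDict` (capacity and key shape are assumptions), is:

```python
from collections import OrderedDict

class LruContextCache:
    """Keeps at most `capacity` interface records, evicting the least
    recently used record when a new one is stored."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.records = OrderedDict()  # (process, content info) -> context data

    def get(self, process, content):
        key = (process, content)
        if key not in self.records:
            return None
        self.records.move_to_end(key)  # mark as most recently used
        return self.records[key]

    def put(self, process, content, context):
        key = (process, content)
        if key in self.records:
            self.records.move_to_end(key)
        self.records[key] = context
        if len(self.records) > self.capacity:
            self.records.popitem(last=False)  # evict least recently used
```

Touching an entry with `get` protects it from eviction, so interfaces the user returns to often stay cached while stale ones age out.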
In one implementation, when the second interface requested by the trigger instruction is opened for the first time, the process, the content information and the context data matched with the interface are extracted and stored.
Optionally, when the trigger instruction is a voice trigger instruction, slot position information of the trigger instruction is acquired through voice recognition, and the process of the second interface, the content information of the second interface and the context data associated with the second interface are determined according to the slot position information.
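To make the slot-filling step concrete, here is a toy stand-in for the recognition stage: it assumes the speech recognizer has already produced text, and the utterance pattern and slot names are entirely hypothetical (a real system would use an NLU model rather than a regular expression):

```python
import re

# Hypothetical utterance pattern, e.g. "open shopping page for running shoes".
SLOT_PATTERN = re.compile(r"open (?P<process>\w+) (?:page )?for (?P<content>.+)")

def extract_slots(recognized_text):
    """Return (process, content info) slots parsed from recognized speech,
    or None when the utterance does not match the pattern."""
    match = SLOT_PATTERN.match(recognized_text.strip().lower())
    if match is None:
        return None
    return match.group("process"), match.group("content")
```

The extracted slots would then serve as the lookup key for the cached context data of the second interface.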
In one implementation, the second interface is a network interface opened by interaction with a server; or, the second interface is a local interface.
For the specific implementation of each functional module in this embodiment, reference may be made to the description in the foregoing method embodiment, which is not repeated here.
Another embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the memory stores executable code, and the processor executes the executable code to implement the interface switching method provided in the foregoing embodiments.
Another embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program performs the interface switching method, including at least one of the solutions described in the above embodiments.
The computer storage media of the embodiments of the present application may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail through the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, which all fall within the scope of the present application.

Claims (19)

1. An interface switching method is characterized by comprising the following steps:
displaying a first interface;
receiving a trigger instruction, wherein the trigger instruction is used for triggering interface switching; the trigger instruction comprises the process of the second interface and the content information of the second interface;
acquiring context data associated with the second interface according to the process of the second interface and the content information of the second interface contained in the trigger instruction;
switching from the displayed first interface to the second interface according to the context data associated with the second interface.
2. The method of claim 1, wherein the second interface is a historically opened interface; the cache stores the process of the second interface, the content information of the second interface and the context data associated with the second interface.
3. The method of claim 2, wherein the process of the second interface, the content information of the second interface, and the context data associated with the second interface are cached in the cache in the form of a queue, respectively.
4. The method of claim 1, wherein the triggering instructions further comprise: a layout of the second interface.
5. The method of claim 1, wherein the context data associated with the second interface is a link to jump to the second interface or address information of the second interface.
6. The method of claim 2, wherein after the switching from the displayed first interface to the second interface, the method further comprises:
refreshing the data stored in the cache using a least recently used algorithm.
7. The method according to claim 1, wherein when the second interface requested by the trigger instruction is opened for the first time, the process, the content information and the context data matched with the interface are extracted and stored.
8. The method according to claim 1, wherein when the trigger instruction is a voice trigger instruction, slot position information of the trigger instruction is acquired through voice recognition; and determining the process of the second interface, the content information of the second interface, and the context data associated with the second interface according to the slot position information.
9. The method of claim 1, wherein the second interface is a web interface opened by interaction with a server; or, the second interface is a local interface.
10. The method of claim 9, wherein the second interface is a shopping item related interface.
11. The method of claim 9, wherein the second interface is a photo album related interface.
12. An electronic device, comprising:
the display module is used for displaying a first interface;
the receiving module is used for receiving a trigger instruction, and the trigger instruction is used for triggering interface switching; the trigger instruction comprises the process of the second interface and the content information of the second interface;
the obtaining module is used for obtaining context data associated with the second interface according to the process of the second interface and the content information of the second interface contained in the trigger instruction;
a switching module for switching from the displayed first interface to the second interface according to the context data associated with the second interface.
13. The electronic device of claim 12, wherein the second interface is a historically opened interface; the cache stores the process of the second interface, the content information of the second interface and the context data associated with the second interface.
14. The electronic device of claim 13, wherein the process of the second interface, the content information of the second interface, and the context data associated with the second interface are cached in the cache in the form of a queue, respectively.
15. The electronic device of claim 12, wherein the triggering instructions further comprise: a layout of the second interface.
16. The electronic device of claim 12, wherein after the switching from the displayed first interface to the second interface, the switching module is further configured to:
refresh the data stored in the cache using a least recently used algorithm.
17. An electronic device, comprising a memory and a processor, wherein the memory stores executable code, and the processor executes the executable code to implement the interface switching method according to any one of claims 1 to 11.
18. A computer-readable storage medium having stored thereon program instructions, which, when executed by a computer, cause the computer to execute the interface switching method of any one of claims 1 to 11.
19. A computer program product, which, when run on a computing device, causes the computing device to perform the interface switching method of any one of claims 1 to 7.
CN202110483579.1A 2021-04-30 2021-04-30 Interface switching method and electronic equipment Pending CN115268736A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110483579.1A CN115268736A (en) 2021-04-30 2021-04-30 Interface switching method and electronic equipment
PCT/CN2022/085396 WO2022228066A1 (en) 2021-04-30 2022-04-06 Interface switching method and electronic device


Publications (1)

Publication Number Publication Date
CN115268736A true CN115268736A (en) 2022-11-01

Family

ID=83744676


Country Status (2)

Country Link
CN (1) CN115268736A (en)
WO (1) WO2022228066A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140082343A1 (en) * 2012-09-14 2014-03-20 Panasonic Corporation Information processing apparatus
CN107590281A (en) * 2017-09-29 2018-01-16 惠州Tcl移动通信有限公司 Control method, mobile terminal and the storage medium that a kind of webpage is switched fast
CN107729098A (en) * 2017-09-25 2018-02-23 北京小米移动软件有限公司 Method for displaying user interface and device
CN110286976A (en) * 2019-05-24 2019-09-27 华为技术有限公司 Interface display method, device, terminal and storage medium
WO2020233556A1 (en) * 2019-05-20 2020-11-26 华为技术有限公司 Call content processing method and electronic device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113485813A (en) * 2021-07-26 2021-10-08 维沃移动通信有限公司 Application skipping method and device


Also Published As

Publication number Publication date
WO2022228066A1 (en) 2022-11-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination