CN116301510A - Control positioning method and electronic equipment - Google Patents
- Publication number
- CN116301510A (application number CN202210010061.0A)
- Authority
- CN
- China
- Prior art keywords
- page
- node
- control
- path
- electronic device
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Abstract
The embodiments of the present application disclose a control positioning method and an electronic device, relate to the field of terminals, and provide a fast and convenient way to locate a control in an application and open the page corresponding to that control, so that a user can quickly operate the electronic device by voice. The specific scheme is as follows: the electronic device acquires first text information, where the first text information includes a first keyword; the electronic device traverses the controls in the application according to the first text information and matches them to obtain a target control, where the target control corresponds to the first keyword; the electronic device acquires a path from the current page to the target control and determines how to reach the target control from the current page; and the electronic device opens the page corresponding to the target control.
Description
Cross Reference to Related Applications
The present application claims priority to the Chinese patent application No. 202111559609.9, entitled "A control positioning method and electronic device", filed with the Chinese Patent Office on December 20, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of terminals, and in particular to a control positioning method and an electronic device.
Background
Current electronic devices support a voice assistant function, through which a user can issue voice control commands to the electronic device, for example to open an application on it. Operating the electronic device through a voice assistant reduces the difficulty of using the device, offers convenience to visually impaired users, and also meets the operating needs of users in scenarios where they cannot free both hands to operate the device.
At present, however, a voice assistant can only recognize certain specific user intentions, such as "open the clock", "open the stopwatch", or "open the calendar". These commands merely help the user open a particular application, so the operations available to a user controlling an electronic device through the voice assistant are limited.
Disclosure of Invention
The embodiments of the present application provide a control positioning method that enables a user to jump to different pages within an application by voice, effectively reducing the difficulty of using an electronic device.
In a first aspect, the present application provides a control positioning method, including: an electronic device displays a first page of a first application; the electronic device receives a first instruction, where the first instruction instructs the device to open a second page corresponding to a target control; the electronic device determines the steps for opening the second page corresponding to the target control; and the electronic device opens and displays the second page corresponding to the target control according to those steps.
With this method, the desired page can be opened quickly within an application, reducing the effort the user spends searching for the entry that opens the page and providing a convenient way of opening pages.
In one possible implementation, the electronic device sequentially opens and displays a third page and the second page, where the third page includes the target control.
In one possible implementation, the electronic device displays the third page after opening the third page and before opening the second page.
In one possible implementation, the first instruction includes a first voice instruction, and the method further includes: the electronic device converts the first voice instruction into a first text; and the electronic device determines the target control according to the first text, where the first text includes a first keyword and the first keyword matches the target control.
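For illustration only, a minimal Java sketch of this keyword matching is given below. It is not code from the patent: the Control class, its keyword field, and matching by substring containment are assumptions made for the example.

```java
import java.util.List;

// Minimal sketch, assuming each control is registered with one trigger keyword.
public final class KeywordMatcher {

    /** Hypothetical stand-in for a control together with its registered keyword. */
    public static final class Control {
        final String keyword;
        public Control(String keyword) { this.keyword = keyword; }
    }

    /**
     * Returns the target control whose keyword appears in the first text
     * converted from the first voice instruction, or null if none matches.
     */
    public static Control match(String firstText, List<Control> controls) {
        for (Control c : controls) {
            if (firstText.contains(c.keyword)) {
                return c; // the first keyword matches this target control
            }
        }
        return null;
    }
}
```

For example, with controls registered for "add alarm clock" and "stopwatch", the first text "open add alarm clock" would match the "add alarm clock" control.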
In one possible implementation, the electronic device plays first voice information, where the first voice information is used to indicate that the electronic device has opened a second page corresponding to the target control.
In one possible implementation, the electronic device determines the second page corresponding to the target control according to a control page tree, where the control page tree includes control nodes, page nodes, and the relationships between control nodes and page nodes; a control node corresponds to a control included in the first application, and a page node corresponds to a page included in the first application.
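As a hedged illustration of such a control page tree, the following Java sketch models page nodes, control nodes, and the parent-child relationships between them. The class and field names are hypothetical, not taken from the patent.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative data structure for a control page tree: page nodes correspond to
// pages of the first application, control nodes to its controls, and the edges
// record which page contains which control and which page a control opens.
public class ControlPageTree {

    public static abstract class Node {
        final String name;
        final List<Node> children = new ArrayList<>();
        Node parent;
        Node(String name) { this.name = name; }
        public void addChild(Node child) { child.parent = this; children.add(child); }
    }

    /** A node corresponding to a page (e.g., an Activity or Fragment) of the application. */
    public static class PageNode extends Node {
        public PageNode(String name) { super(name); }
    }

    /** A node corresponding to a control (View); its child is the page the control opens. */
    public static class ControlNode extends Node {
        public ControlNode(String name) { super(name); }
    }

    /** Root of the tree: the node corresponding to the application's home page. */
    public final PageNode home = new PageNode("home");
}
```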
In one possible implementation, the electronic device determines a path from a first page node to a target control node according to the control page tree, where the first page node is the node corresponding to the first page and the target control node is the node corresponding to the target control; the electronic device then determines how to open the second page corresponding to the target control according to the path from the first page node to the target control node.
In one possible implementation, the electronic device acquires a first path from a home page node to the target control node and a second path from the home page node to the first page node, where the home page node is the node corresponding to the home page of the application; if the first path includes the second path, the path from the first page node to the target control node is determined through the child nodes of the first page node.
In one possible implementation, the electronic device acquires a first path from a home page node to the target control node and a second path from the home page node to the first page node, where the home page node is the node corresponding to the home page of the application; if the first path does not include the second path, the path from the first page node to the target control node is determined through a third page node, where the third page node is the lowest node on the common portion of the first path and the second path.
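The two implementations above can be read as one path-resolution procedure. The Java sketch below is illustrative only, building on the hypothetical ControlPageTree structure above: it acquires the first path (home page node to target control node) and the second path (home page node to first page node), then branches on whether the first path includes the second path.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the path determination described above.
public final class PathResolver {

    /** The chain of nodes from the home page node (the root) down to the given node. */
    static List<ControlPageTree.Node> pathFromHome(ControlPageTree.Node node) {
        List<ControlPageTree.Node> path = new ArrayList<>();
        for (ControlPageTree.Node n = node; n != null; n = n.parent) {
            path.add(0, n);
        }
        return path;
    }

    /** Nodes to visit, in order, to get from the displayed first page to the target control. */
    static List<ControlPageTree.Node> resolve(ControlPageTree.Node firstPage,
                                              ControlPageTree.Node targetControl) {
        List<ControlPageTree.Node> firstPath = pathFromHome(targetControl); // home -> target control
        List<ControlPageTree.Node> secondPath = pathFromHome(firstPage);    // home -> first page

        if (firstPath.size() >= secondPath.size()
                && firstPath.subList(0, secondPath.size()).equals(secondPath)) {
            // Case 1: the first path includes the second path, so the target control
            // is reached by descending through the first page's child nodes.
            return new ArrayList<>(firstPath.subList(secondPath.size(), firstPath.size()));
        }

        // Case 2: locate the third page node, i.e. the lowest node on the common
        // portion of the two paths.
        int common = 0;
        while (common < firstPath.size() && common < secondPath.size()
                && firstPath.get(common) == secondPath.get(common)) {
            common++;
        }
        List<ControlPageTree.Node> route = new ArrayList<>();
        for (int i = secondPath.size() - 2; i >= common - 1; i--) {
            route.add(secondPath.get(i)); // navigate back up towards the third page node
        }
        route.addAll(firstPath.subList(common, firstPath.size())); // then descend to the target
        return route;
    }
}
```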
In a second aspect, the present application provides an electronic device, including: one or more processors, a memory, a touch screen, and one or more computer programs, connected by one or more communication buses. The touch screen includes a touch surface and a display. The one or more computer programs are stored in the memory and configured to be executed by the one or more processors, and include instructions for performing the control positioning method of any one of the first aspects.
In a third aspect, the present application provides a computer storage medium comprising computer instructions that, when run on an electronic device, cause the electronic device to perform the control positioning method of any one of the first aspects.
In a fourth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the control positioning method of any one of the first aspects.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a software architecture according to an embodiment of the present application;
fig. 3 is a schematic diagram of an architecture within an application according to an embodiment of the present application;
fig. 4 is a schematic diagram of an application page according to an embodiment of the present application;
fig. 5 is a schematic diagram of another application page according to an embodiment of the present application;
fig. 6 is a schematic diagram of a data structure according to an embodiment of the present application;
fig. 7 is a schematic diagram of a control page tree according to an embodiment of the present application;
fig. 8 is a schematic diagram of another control page tree according to an embodiment of the present application;
fig. 9 is a flowchart of a control positioning method according to an embodiment of the present application;
fig. 10 is a flowchart of another control positioning method according to an embodiment of the present application;
fig. 11 is a flowchart of still another control positioning method according to an embodiment of the present application;
fig. 12A is a schematic diagram of a page according to an embodiment of the present application;
fig. 12B is a schematic diagram of another page according to an embodiment of the present application;
Detailed Description
The technical solutions in the embodiments of the present application will be described in detail below with reference to the drawings in the following embodiments of the present application.
In general, a user operates an electronic device through touch, key presses, and similar means. For example, a user may tap an application icon on the touch display screen of the electronic device to open that application. However, controlling the electronic device through touch, keys, and the like can be inconvenient in some special situations. For example, when the user cannot free both hands, such as while driving or cooking, it is inconvenient to control the electronic device by touch; likewise, when the user is visually impaired, it is inconvenient to control the electronic device by touch. In such cases the user can control the electronic device through a voice assistant: the voice assistant receives a voice control command issued by the user, so as to implement control of the electronic device.
At present, a voice assistant can implement the function of opening an application on the electronic device. For example, after the user activates the voice assistant and issues the voice command "open the clock", the electronic device can parse out the user's intention and open the clock application. However, once the user has already opened an application, the voice assistant cannot open the page corresponding to a particular control within that application; for example, when the user has already opened the clock application, the voice assistant cannot open the page corresponding to the "add alarm clock" control in the clock application.
In view of this, the embodiments of the present application provide a control positioning method. Through the dependency relationships between pages and controls established in a control search service built into the application, the method locates a control in the application via the voice assistant and opens the page corresponding to that control, so that the user can conveniently jump to different pages within the application by voice, which effectively reduces the difficulty of using the electronic device.
It should be understood that an application program (application for short) in the embodiments of the present application is a software program capable of implementing one or more specific functions. Typically, multiple applications may be installed on an electronic device, such as a camera application, a text messaging application, a mailbox application, a video application, or a music application. An application mentioned below may be an application installed when the electronic device left the factory, or an application downloaded from the network or obtained from another electronic device while the user uses the device.
It should be noted that the method provided in the embodiments of the present application is applicable to any electronic device having a display screen, such as a mobile phone, a tablet computer, a wearable device (e.g., a watch, a bracelet, a smart helmet, smart glasses), a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), which is not limited in the embodiments of the present application. The electronic device in the embodiments of the present application may also be a foldable electronic device, such as a foldable mobile phone or a foldable tablet computer, which is likewise not limited in this application. Exemplary embodiments of the electronic device include, but are not limited to, devices running Android, HarmonyOS, or other operating systems.
The structure of the electronic device will be described below using a mobile phone as an example.
As shown in fig. 1, fig. 1 is a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and a peripheral device, or to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates bearing conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and can also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
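A minimal sketch of this threshold behavior, assuming a normalized pressure value and hypothetical instruction names, could look as follows:

```java
// Illustrative only: dispatching different instructions for the same touch
// position depending on touch operation intensity. The threshold value and
// instruction strings are assumptions for the example.
public final class PressureDispatcher {

    static final float FIRST_PRESSURE_THRESHOLD = 0.5f; // assumed normalized intensity

    /** Instruction executed for a touch of the given intensity on the short message icon. */
    static String dispatch(float touchIntensity) {
        return touchIntensity < FIRST_PRESSURE_THRESHOLD
                ? "view short message"        // lighter press
                : "create new short message"; // press at or above the first pressure threshold
    }
}
```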
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby achieving anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip-open can then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The sensor can also be used to recognize the attitude of the electronic device, and is applied in scenarios such as landscape/portrait switching and pedometers.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In other embodiments, when the temperature is below still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, forming a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into the SIM card interface 195 or removed from it, to make contact with or be separated from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
It will be appreciated that the components shown in fig. 1 do not constitute a specific limitation on the mobile phone, which may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently. In the following embodiments, the mobile phone 100 shown in fig. 1 is taken as an example for description.
The software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiments of the present application, an Android system with a layered architecture is taken as an example to illustrate the software structure of the mobile phone 100. It should be understood that the system in the embodiments of the present application may also be a HarmonyOS (HongMeng) system, which is not limited in this application.
The software architecture of the electronic device is described below in connection with different scenarios. Fig. 2 is a software structure block diagram of the mobile phone 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, from top to bottom: an application layer, an application framework layer, the Android runtime (ART) and native C/C++ libraries, a hardware abstraction layer (HAL), and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 2, the application package may include applications such as cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, voice assistants, etc.
The application framework layer provides an application programming interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, an activity manager, an input manager, and so forth.
The window manager provides, among other things, window management services (window manager service, WMS) that may be used for window management, window animation management, surface management, and as a transfer station to an input system.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications appearing on the screen in the form of a dialog window. For example, text is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The activity manager may provide activity management services (Activity Manager Service, AMS), which may be used for starting, switching, and scheduling system components (e.g., Activities, Services, content providers, broadcast receivers), and for managing and scheduling application processes.
The input manager may provide input management services (Input Manager Service, IMS), which may be used to manage inputs to the system, such as touch screen inputs, key inputs, sensor inputs, and the like. The IMS retrieves events from the input device node and distributes the events to the appropriate windows through interactions with the WMS.
The Android runtime includes the core libraries and the Android runtime (ART). The Android runtime is responsible for converting source code into machine code, mainly through ahead-of-time (AOT) compilation and just-in-time (JIT) compilation.
The core libraries mainly provide the functions of the basic Java class libraries, such as basic data structures, mathematics, IO, tools, databases, and networking. The core libraries provide the APIs used to develop Android applications.
The native C/c++ library may include a plurality of functional modules. For example: surface manager (surface manager), media Framework (Media Framework), libc, openGL ES, SQLite, webkit, etc.
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media framework supports playback and recording of many commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG. OpenGL ES provides the drawing and manipulation of 2D and 3D graphics in applications. SQLite provides a lightweight relational database for the applications of the electronic device 100.
The hardware abstraction layer runs in a user space (user space), encapsulates the kernel layer driver, and provides a call interface to the upper layer.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of the electronic device 100 software and hardware is illustrated below based on fig. 1 and 2. When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including the coordinates of the touch operation). The original input event is stored at the kernel layer. The application framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. Taking a sliding operation as an example of the touch operation: when the touch sensor 180K receives the sliding operation, a corresponding hardware interrupt is issued to the kernel layer; the kernel layer may process the sliding operation into an original input event and store the event, where the event may include the start position coordinates and the end position coordinates of the sliding operation; the application framework layer then obtains the original input event from the kernel layer and identifies the input event.
Furthermore, in the following embodiments, "at least one" means one or more, and "a plurality" means two or more. In addition, it should be understood that in the description of this application, words such as "first" and "second" are used merely to distinguish between the things they describe.
Fig. 3 is an application architecture diagram of an application A on an electronic device; as shown, application A may include a control search service (Widget Search Service), a plurality of activities, a plurality of fragments, and a plurality of views. It will be appreciated that fig. 3 merely takes application A as an example, and the embodiment of the present invention does not limit the type of application; for example, application A may be a system application (i.e., an application of the mobile phone such as "settings", "clock", or "album") or a third-party application (such as "WeChat" or "Alipay").
The control search service (Widget Search Service) module is built into application A. It can receive keywords passed from the voice assistant application, traverse the keywords corresponding to the controls by following the dependency relationship between controls and pages, and match the keywords against a target control, thereby searching for and locating a control within application A. The specific traversal and matching procedures are described in detail in the examples below. The control search service is also used to establish, update, and store the dependency relationship between controls and pages.
Activity is a page contained in an application, fragment is a sub-page contained in the application, and View is a control contained in the application. There may be multiple activities, multiple fragments, and multiple views in an application, where the views may be contained within an Activity or Fragment and the views may contain links that open an Activity or Fragment. Thus, when a certain Activity or Fragment in an application is located, the View contained in the Activity or Fragment can be clicked, so that the Activity or Fragment linked by the View is opened.
In particular, reference may be made to the application interface schematic diagrams of fig. 4 and 5.
As shown in fig. 4, an application may include one or more pages; for example, the settings application may include an Activity A, an Activity B, and an Activity C, and each page may in turn include one or more controls. For example, Activity A includes a control View A1 corresponding to the search box "search setting item", a control View A2 corresponding to the option "sound and vibration", a control View A3 corresponding to the option "notification", a control View A4 corresponding to the option "application and service", and a control View A5 corresponding to the option "battery". Activity B includes a control View B1 corresponding to the option "application management", a control View B2 corresponding to the option "application start management", a control View B3 corresponding to the option "service management", and a control View B4 corresponding to the option "rights management". Activity C includes a control View C1 corresponding to the search box "search application", a control View C2 corresponding to the option "all application management", a control View C3 corresponding to the switch of the option "all application management", a control View C4 corresponding to the option "clear link", and a control View C5 corresponding to the switch of the option "clear link".
As shown in fig. 5, an interface of an application may include one or more sub-pages; for example, a page of the gallery application may include the sub-pages Fragment A, B, C, and D, and each sub-page may further include one or more controls. For example, Fragment B includes a control View 5 corresponding to "all photos" in the option bar, a control View 6 corresponding to "video" in the option bar, a control View 7 corresponding to "screenshot" in the option bar, and a control View 8 corresponding to "view details" in the option bar; Fragment D includes a control View 9 corresponding to "micro movie production" in the option bar, a control View 10 corresponding to "free view" in the option bar, a control View 11 corresponding to "jigsaw puzzle" in the option bar, and a control View 12 corresponding to the "new image" link.
It should be explained that, within an application, a control has a corresponding page; that is, operating a certain control can open a certain page, and that page is the page corresponding to the control.
The control search service has a built-in control search algorithm through which control search and positioning can be completed. To make it easier to match the most accurate control based on the text information delivered by the voice assistant, a keyword may be defined for each control contained within the application, so that matching is performed between the text information and each control's keyword. The keyword may be the text description or name of a control, such as the text description "sound and vibration" of control View A2 in fig. 4; the keyword may also be defined in other ways, such as by adding a descriptive field in the control's class. It will be appreciated that keywords may be text in Chinese, English, or other languages, and the invention is not particularly limited in this regard.
In some embodiments, a field may be defined in the control's class as a Keyword, through which an instantiated object may be assigned a keyword. As shown in fig. 6, a Keyword field is defined in the View class and may be inherited by subclasses of View such as Button, TextView, and ImageView.
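As an illustration only, a minimal Java sketch of such a field follows; the class name SearchableView and its accessors are assumptions, since the patent only specifies that a Keyword field is defined in the View class and inherited by its subclasses:

```java
// Minimal sketch of a Keyword field on a custom View base class.
// The class name SearchableView and the accessor names are
// illustrative assumptions; the patent only states that a Keyword
// field is defined in the View class and inherited by subclasses
// such as Button, TextView and ImageView.
public class SearchableView extends android.view.View {
    private String keyword;

    public SearchableView(android.content.Context context) {
        super(context);
    }

    public void setKeyword(String keyword) {
        this.keyword = keyword;
    }

    public String getKeyword() {
        return keyword;
    }
}
```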
When an application is published, a dependency relationship between the page and the control may be established, which may be a tree structure, such as a control page tree. Specifically, for building a control page tree, the parent-child relationship between the control and the page can be defined as follows: (1) If a certain control W is operated and can jump to the page P, the control W is a father node of the page P; (2) If the control W1, W2, W3 … is contained within page P, then page P is the parent of W1, W2, W3 …. It can be understood that the relationship between the control node and the page node is interdependent, that is, the control node can be used as a parent node of the page node or as a child node of the page node, and meanwhile, the page node can be used as a parent node of the control node or as a child node of the control node.
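A minimal Java sketch of such a tree node follows; the patent does not prescribe a concrete implementation, so the TreeNode type and its method names are assumptions that merely encode rules (1) and (2):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical node type for the control page tree described above.
// Node kinds alternate along any path: a page's children are the
// controls it contains (rule 2), and a control's child is the page
// it jumps to (rule 1).
class TreeNode {
    enum Kind { PAGE, CONTROL }

    final Kind kind;
    final String name;              // e.g. "Activity A" or "View A3"
    TreeNode parent;                // doubly linked: child -> parent
    final List<TreeNode> children = new ArrayList<>();

    TreeNode(Kind kind, String name) {
        this.kind = kind;
        this.name = name;
    }

    // Rule (2): page P is the parent node of the controls it contains.
    void addControl(TreeNode control) {
        control.parent = this;
        children.add(control);
    }

    // Rule (1): control W is the parent node of the page P it opens.
    void setTargetPage(TreeNode page) {
        page.parent = this;
        children.add(page);
    }
}
```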
For example, in the "setting" application, as in fig. 4, the pages may include an Activity a, an Activity B, and an Activity C, where the user may jump to the Activity B by clicking a control View A3 located in the Activity a, the Activity a corresponds to a main setting page of the "setting" application, the Activity B corresponds to a setting page of "application and service" of the "setting" application, and the control View A3 corresponds to an "application and service" setting item on the main setting page. Based on the jump relation and the (2) th defined for the father-son relation of the control and the page, the page Activity A can be defined as a father node of the control View A3, and accordingly, the control View A3 is a child node of the page Activity A; based on the jump relation and the above (1) defined parent-child relation between the control and the page, the control View A3 may be defined as a parent node of the page Activity B, and accordingly, the page Activity B is a child node of the control View A3. Thus, a control page tree for the "set" application is built according to the above-described jump relationship, as shown in fig. 7.
In some embodiments, an application may also include sub-pages (Fragments), and a sub-page may likewise establish a parent-child node relationship with a control. For example, the "album" application in fig. 5 may include a main page Activity A, sub-pages Fragment A, B, C, and D, and controls View1, View 2, View 3, and View4, where sub-page Fragment A corresponds to the "photo" page, sub-page Fragment B corresponds to the "album" page, sub-page Fragment C corresponds to the "time" page, and sub-page Fragment D corresponds to the "find" page, while controls View1, View 2, View 3, and View4 correspond, respectively, to the "photo", "album", "time", and "find" options in the option bar at the bottom of the main page. The user can jump to sub-page Fragment A, corresponding to the "photo" page, by clicking control View1 in Activity A; jump to sub-page Fragment B, corresponding to the "album" page, by clicking control View 2 in Activity A; jump to sub-page Fragment C, corresponding to the "time" page, by clicking control View 3 in Activity A; and jump to sub-page Fragment D, corresponding to the "find" page, by clicking control View4 in Activity A. Based on these jump relationships and rule (2) defined above for the parent-child relationship between controls and pages, page Activity A may be defined as the parent node of controls View1, View 2, View 3, and View4; accordingly, controls View1, View 2, View 3, and View4 are child nodes of page Activity A. Based on these jump relationships and rule (1) defined above, control View1 may be defined as the parent node of sub-page Fragment A, control View 2 as the parent node of sub-page Fragment B, control View 3 as the parent node of sub-page Fragment C, and control View4 as the parent node of sub-page Fragment D; accordingly, Fragment A is a child node of View1, Fragment B a child node of View 2, Fragment C a child node of View 3, and Fragment D a child node of View4. A control page tree for the "album" application built according to the above jump relationships is shown in fig. 8.
It will be appreciated that fig. 7 and 8 merely illustrate the dependency of controls on pages within an application, and that the control page tree may be larger than what is shown in fig. 7 or 8. In general, when the parent node is a page or sub-page, the child nodes are controls; when the parent node is a control, the child node is a page or sub-page. When the parent node is a page or sub-page, multiple controls may serve as its child nodes; when the parent node is a control, there is typically only one page or sub-page as the child node of the control. The control page tree can be dynamically updated after being established; for example, the relationships between parent and child nodes in the tree structure can be changed or updated according to the user's clicking habits. The root node of the control page tree is usually the top-level page of the application; for example, the main settings page of the "setting" application corresponds to the Activity A node in fig. 7, and the main page of the "album" application corresponds to the Activity A node in fig. 8. The control page tree can be stored locally in the application on the electronic device as a doubly linked list structure; the embodiment of the invention does not particularly limit the type of data structure.
A method for positioning a control is specifically described below with reference to fig. 9, where the method specifically includes:
Step 901: the electronic device obtains first text information, wherein the first text information comprises a first keyword.
Specifically, the electronic device may acquire first text information, where the first text information may be text information that is input into the electronic device by a user through a typing manner, or may be text information that is acquired by the electronic device by identifying acquired voice information of the user. The first keyword may include a text description corresponding to a control that the user wants to interact with, or a text similar to the text description, such as "album", "sound and vibration", and so on.
In some embodiments, the user may open the first application and display a first page of the first application before the electronic device obtains the first text information. On a first page of a first application, the electronic device may obtain first text information.
Step 902: and the electronic equipment traverses all the controls in the application according to the first text information, and matches to obtain a target control, wherein the target control corresponds to the first keyword.
Specifically, the target control corresponds to the first keyword, which means that the keyword of the target control corresponds to the first keyword. See above for descriptions of keywords for target controls. And if the keyword of the control corresponds to the first keyword, determining that the control is a matched target control. It will be appreciated that the keywords of the target control are identical, in whole or in part, to the first keywords described above. For example, the keyword of the control is "album", the first text information includes a first keyword "album", and the keyword of the control is the same as the first keyword, so that the control can be determined to be a target control; for example, if the keyword of the control is "sound and vibration", the first text information includes the first keyword "vibration", and the keyword of the control is partially the same as the first keyword, it may also be determined that the control is a target control.
It should be noted that, when the keyword of the target control is only partially the same as the first keyword, different matching strategies may be adopted, such as prefix matching or suffix matching, and a relatively optimal target control may be selected from multiple matched controls through these strategies. In some embodiments, when more than one optimal target control, or more than a preset number of them, are matched, it may be considered that a specific control cannot be accurately located from the first text information; matching may then be terminated, a search failure returned, and the subsequent steps not executed.
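As a sketch of one possible matching strategy, continuing the hypothetical TreeNode above (the method matchTarget, the substring test, and the maxCandidates threshold are all assumptions; the embodiment leaves the exact strategy open):

```java
import java.util.ArrayList;
import java.util.List;

// Continues the TreeNode sketch above. Hypothetical matching strategy:
// an exact keyword match wins immediately; otherwise partial matches
// are collected, and the search fails when nothing matches or more
// than a preset number of equally plausible controls remain.
class ControlMatcher {
    static TreeNode matchTarget(List<TreeNode> controls, String firstKeyword, int maxCandidates) {
        List<TreeNode> candidates = new ArrayList<>();
        for (TreeNode c : controls) {
            String kw = c.name; // stand-in for the control's Keyword field
            if (kw.equals(firstKeyword)) {
                return c; // keyword identical to the first keyword
            }
            // partially identical, e.g. "vibration" vs "sound and vibration"
            if (kw.contains(firstKeyword) || firstKeyword.contains(kw)) {
                candidates.add(c);
            }
        }
        if (candidates.isEmpty() || candidates.size() > maxCandidates) {
            return null; // cannot accurately locate a specific control
        }
        return candidates.get(0);
    }
}
```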
Step 903: And the electronic equipment acquires a path from the current page node to the target control node and determines the steps for reaching the target control node along the path.
Specifically, the electronic device obtaining the path from the current page node to the target control node includes obtaining the path according to the dependency relationship between controls and pages. For example, the path from the current page node to the target control node may be determined from the parent-child node relationships of the control page tree.
For example, taking a "setting" application as an example, as shown in fig. 7, if the current page is Activity B and the currently matched target control is View B4, a path from the current page Activity B node to the target control View B4 node may be determined to be Activity B- > View B4 according to the control page tree. At this time, the step of determining to reach the target node only needs one step, namely, the target control node can be reached through one step of the path. As shown in fig. 7, if the current interface is still Activity B, and the currently matched target control is View C5, a path from the current page Activity B node to the target control View C5 node may be determined to be Activity B- > View B2- > Activity C- > View C5 according to the control page tree. At this time, the step of determining to reach the target node needs three steps, namely, three steps are needed to reach the target control node through the path.
Step 904: and the electronic equipment opens the page corresponding to the target control node.
Specifically, after the path from the current page node to the target control node is determined, the target control node may be reached by following the steps of the path, and the page corresponding to the target node is opened according to the child node connected to the target node.
For example, still taking the "setting" application as an example, as shown in fig. 4 and fig. 7, if the page where the application is currently located is Activity B and the currently matched target control is View B2, the text description of control View B2 is "application start management", which corresponds to the page Activity C where "application start management" can be opened. From the parent-child relationships of the control page tree, it can be obtained that Activity C is the child node of View B2; the corresponding page child node can thus be found through the target control node, and that page opened.
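Continuing the sketch, this final lookup from a target control node to its page child could look like the following (pageToOpen is an assumed helper):

```java
// Continues the TreeNode sketch above: a control node normally has a
// single page child in the control page tree, which is the page to open.
class PageOpener {
    static TreeNode pageToOpen(TreeNode targetControl) {
        return targetControl.children.isEmpty() ? null : targetControl.children.get(0);
    }
}
```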
By the above method, positioning of a specific control in an application can be achieved: according to the established dependency relationship between controls and pages, the position of the target control node relative to the current page node is found by matching, the path from the current page node to the target control node is obtained, and the target control is thereby found and the page corresponding to the control opened. This scheme can quickly and conveniently help the user find an interactable control in an application and open the page corresponding to that control.
The embodiment of the application also provides a method for positioning the control, as shown in fig. 10, which specifically includes:
step 1001: the electronic device obtains first text information, wherein the first text information comprises a first keyword.
Step 1002: and the electronic equipment traverses all the controls in the application according to the first text information, and matches to obtain a target control, wherein the target control corresponds to the first keyword.
The above step 1001 and step 1002 are the same as the step 901 and step 902 in fig. 9, and are not described here again.
Step 1003: And the electronic equipment respectively acquires a target control path from the root node to the target control node and a current page path from the root node to the current page node.
Specifically, the root node is the topmost node of the control page tree, usually the node corresponding to the main page or home page of an application, or to the page displayed by default when the application is opened. For example, the Activity A node in fig. 7 corresponds to the main settings page of the "setting" application; as another example, the Activity A node in fig. 8 corresponds to the page displayed by default when the "album" application is opened.
The target control path refers to the path from the root node to the target control node; the current page path refers to the path from the root node to the current page node. For example, as shown in fig. 7, if the current page is Activity B and the target control is View B2, the current page path is Activity A -> View A4 -> Activity B and the target control path is Activity A -> View A4 -> Activity B -> View B2. As another example, as shown in fig. 7, if the current page is Activity B and the target control is View A1, the current page path is Activity A -> View A4 -> Activity B and the target control path is Activity A -> View A1.
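Reusing the hypothetical pathFrom() above, both paths can be obtained starting from the root node; root, targetControl, and currentPage are assumed to be nodes of the same control page tree:

```java
// Usage sketch, reusing the hypothetical PathFinder above: both paths
// start from the root node of the control page tree.
List<TreeNode> targetControlPath = PathFinder.pathFrom(root, targetControl);
// e.g. Activity A -> View A4 -> Activity B -> View B2
List<TreeNode> currentPagePath = PathFinder.pathFrom(root, currentPage);
// e.g. Activity A -> View A4 -> Activity B
```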
It will be appreciated that there may be one or more target control paths, and one or more current page paths. When there are multiple paths, it means that the page corresponding to the target control can be opened in multiple ways within the application; that is, the same target control node can be reached from the root node through multiple paths.
Step 1004: And the electronic equipment calculates a common path of the current page path and the target control path, and judges whether the common path contains the current page path.
Specifically, the common path is the portion where the current page path coincides with the target control path. For example, as shown in fig. 7, if the current page is Activity B and the target control is View B2, the current page path is Activity A -> View A4 -> Activity B and the target control path is Activity A -> View A4 -> Activity B -> View B2; in this case the common path of the two is Activity A -> View A4 -> Activity B. As another example, as shown in fig. 7, if the current page is Activity B and the target control is View A1, the current page path is Activity A -> View A4 -> Activity B and the target control path is Activity A -> View A1; in this case the common path of the two is Activity A.
It should be noted that the common path containing the current page path means that the common path includes the complete current page path: if the common path is taken as a set A and the current page path as a set B, the common path containing the current page path is equivalent to set B being a subset of set A. Taking the example above, if the current page is Activity B and the target control is View B2, the common path of the current page path and the target control path is Activity A -> View A4 -> Activity B, and the current page path is Activity A -> View A4 -> Activity B, so it may be determined that the common path contains the current page path. That is, for the target control node View B2, the path from the root node to the View B2 node passes through the current page path; in other words, the target control node can be reached from the current page node through its child nodes. If the current page is Activity B and the target control is View A1, the common path of the current page path and the target control path is Activity A, while the current page path is Activity A -> View A4 -> Activity B, so it may be determined that the common path does not contain the current page path. That is, for the target control node View A1, the path from the root node to the View A1 node does not pass through the current page; in other words, the target control node cannot be reached from the current page node through its child nodes.
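A sketch of the common-path computation and the subset test under the same assumptions (commonPath and containsCurrentPagePath are assumed helper names):

```java
import java.util.ArrayList;
import java.util.List;

// Continues the sketch. The common path is the longest shared prefix
// of the two root paths; it contains the current page path exactly
// when the shared prefix covers the whole current page path (set B
// being a subset of set A in the terms used above).
class CommonPath {
    static List<TreeNode> commonPath(List<TreeNode> currentPagePath, List<TreeNode> targetControlPath) {
        List<TreeNode> common = new ArrayList<>();
        int n = Math.min(currentPagePath.size(), targetControlPath.size());
        for (int i = 0; i < n && currentPagePath.get(i) == targetControlPath.get(i); i++) {
            common.add(currentPagePath.get(i));
        }
        return common;
    }

    static boolean containsCurrentPagePath(List<TreeNode> common, List<TreeNode> currentPagePath) {
        return common.size() == currentPagePath.size();
    }
}
```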
Further, if it is determined that the common path contains the current page path, step 1005A is performed; otherwise, if it is determined that the common path does not contain the current page path, step 1005B is performed.
Step 1005A: specifically, if it is determined in step 1004 that the common path contains the current page path, it may be determined that the target control node can be reached from the current page node through its child nodes, and the steps for reaching the target control node are thereby determined. For example, as shown in fig. 7, if the current page is Activity B and the target control is View B2, the current page path is Activity A -> View A4 -> Activity B and the target control path is Activity A -> View A4 -> Activity B -> View B2; the common path of the two is Activity A -> View A4 -> Activity B, which contains the current page path. It may therefore be determined that the current page Activity B node can reach View B2 through its child node; according to fig. 7, the step from Activity B to the target control node View B2 is Activity B -> View B2.
Step 1005B: specifically, if it is determined in step 1004 that the common path does not contain the current page path, the node where the common path and the current page path last coincide is determined; this is the bottommost node of the common path, referred to here as the leaf node. After the leaf node is determined, the steps for reaching the target control node can be determined by returning to the leaf node and triggering from there. For example, as shown in fig. 7, if the current page is Activity C and the target control node is View B4, the current page path is Activity A -> View A4 -> Activity B -> View B2 -> Activity C and the target control path is Activity A -> View A4 -> Activity B -> View B4; the common path of the two is Activity A -> View A4 -> Activity B, so it may be determined that the leaf node where the common path coincides with the current page path is the page Activity B node. In this case, the step after returning to the leaf node is from the Activity B node to the target control node View B4, i.e., Activity B -> View B4. As another example, as shown in fig. 7, if the current page is Activity B and the target control is View A1, the current page path is Activity A -> View A4 -> Activity B and the target control path is Activity A -> View A1; the common path of the two is Activity A, so it may be determined that the leaf node where the common path coincides with the current page path is the page Activity A node. In this case, the step after returning to the leaf node is from the Activity A node to the target control node View A1, i.e., Activity A -> View A1.
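A sketch of the corresponding step planning, assuming the common path and target control path computed above (stepsViaLeaf is an assumed helper name):

```java
import java.util.ArrayList;
import java.util.List;

// Continues the sketch for the step 1005B case: return to the leaf
// node (the bottommost node of the common path), then follow the
// remainder of the target control path.
class StepPlanner {
    static List<TreeNode> stepsViaLeaf(List<TreeNode> common, List<TreeNode> targetControlPath) {
        TreeNode leaf = common.get(common.size() - 1); // e.g. Activity B, or Activity A
        List<TreeNode> steps = new ArrayList<>();
        steps.add(leaf); // first navigate back to the leaf node
        steps.addAll(targetControlPath.subList(common.size(), targetControlPath.size()));
        return steps; // e.g. [Activity A, View A1] for target View A1 from Activity B
    }
}
```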
It should be noted that the leaf node where the common path coincides with the current page path may be the root node of the control page tree, such as the Activity A node in fig. 7, or may be a descendant of the root node, such as the Activity B node in fig. 7.
Step 1006: and opening a page corresponding to the target control node.
Specifically, according to the steps determined in step 1005A or step 1005B, the target control node may be reached, thereby opening the page corresponding to the target node. For example, as shown in fig. 7, if the target node is View B2, the page Activity C, which is the child node of the View B2 node, may be opened.
By the above method, positioning of a specific control in an application can be achieved: according to the established dependency relationship between controls and pages, the position of the target control node relative to the current page node is found by matching, the root-node paths of the current page and the target control are obtained, whether it is necessary to return along the common path of the two is judged, the path reaching the target control is determined, and the page corresponding to the control is opened. This scheme can quickly and conveniently help the user find an interactable control in an application and open the page corresponding to that control.
The embodiment of the application also provides a method for positioning the control, as shown in fig. 11, which specifically includes:
step 1101: the electronic device receives first voice information from a user.
Specifically, the user may input first voice information to the electronic device on an interface where the application is open. The electronic device may receive the first voice information from the user through a voice assistant. The voice assistant may be started in a preset manner, for example by calling the wake word "Xiaoyi" or by double-clicking the power key. Fig. 12A (a) shows an interface in which the voice assistant has been opened within an application; the user may interact with the voice assistant by voice, for example saying "open notification" or "open application and service" to the voice assistant, thereby sending first voice information to the electronic device.
Step 1102: the electronic device converts the first voice information into first text information and transmits the first text information to a control search service in the application.
Specifically, the electronic device may convert the first voice information into first text information and display the first text information on an interface of the voice assistant, as shown in fig. 12A (b). The above-mentioned conversion of the first Speech information into the first Text information may be implemented by Speech recognition, such as Speech To Text (STT), etc.
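The embodiment does not specify which speech recognition mechanism is used; as one illustration only, Android's built-in RecognizerIntent can return recognized text, as in the following sketch (the activity class name is hypothetical):

```java
import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

// One possible way to obtain the first text information from the first
// voice information on Android. The patent does not name a recognizer;
// this sketch uses the platform's RecognizerIntent purely as an
// illustration.
public class VoiceCaptureActivity extends Activity {
    private static final int REQUEST_SPEECH = 1;

    void startRecognition() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK && data != null) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            String firstText = (results == null || results.isEmpty()) ? null : results.get(0);
            // firstText would then be passed to the control search service
            // (Widget Search Service) for control positioning.
        }
    }
}
```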
The first text information includes a first keyword; as shown in fig. 12A (b), the first keyword may be highlighted by bolding, underlining, or the like. For details of the first text information and the first keyword, reference is made to the foregoing embodiments, which are not repeated here.
Further, the electronic device may pass the first text information to the control search service (Widget Search Service) within the application for control positioning processing. The application may be the application currently opened in the foreground, such as the "setting" application shown in fig. 12A, and the application currently opened in the foreground may be a desktop application, a system application, or a third-party application. The application may also be an application running in the background.
Step 1103: and the control searching service searches the target control according to the first text information.
Specifically, the method for searching the target control by the control search service according to the first text information may refer to the method shown in fig. 9 and fig. 10, which is not described herein.
Step 1104: And the electronic equipment broadcasts the search result and executes the search result.
Specifically, the broadcast results described above include two cases, and the two cases are described below.
One case is that the control is positioned successfully: a target control is matched, the page corresponding to the target control is opened, and the successful search result is broadcast while the opened page is displayed. For example, as shown in fig. 12B (a), if the first voice information input by the user is "open application and service", then, with the control positioned successfully, the electronic device may broadcast "matched to the application and service page and opened it" through the voice assistant and display the interface of the application and service page.
The other case is that control positioning fails: no target control is matched, the current page continues to be displayed, and the search-failure result is broadcast. For example, as shown in fig. 12B (b), if the first voice information input by the user is "open address book", then, with control positioning having failed, the electronic device may broadcast "no matching page found" through the voice assistant and continue to display the interface of the current page.
There may be three cases in which the search result is executed: (1) the electronic device sequentially opens the pages passed through on the path from the current page to the page to be opened, but only displays the process of opening the page to be opened; (2) the electronic device sequentially opens the pages passed through on the path from the current page to the page to be opened, and displays the opening process of all the pages; (3) the electronic device directly opens the page to be opened and displays that process.
For example, suppose the user uses the voice assistant on the settings page (Activity A) shown in fig. 4 to open the application start management page. In the first case, the electronic device opens the application and service page (Activity B) and then the application start management page (Activity C), but only displays the process of opening the application start management page; that is, only the animation of opening the application start management page is visible to the user. In the second case, the electronic device first opens the application and service page (Activity B), then opens the application start management page (Activity C), and displays the process of opening both pages; that is, the user sees the animation of opening the application and service page followed by the animation of opening the application start management page. In the third case, the electronic device directly opens the application start management page and displays that process; that is, the animation of opening the application start management page is visible to the user.
By the above method, using the voice assistant within an application and through the control search service, the control that the user requests by voice can be located and the page corresponding to that control opened. This scheme helps the user find interactable controls in an application in a touch-free manner, and provides a convenient way of opening application pages for special groups of users and in scenarios where touch input is inconvenient.
It should be appreciated that the description of a feature, aspect, or similar language in this application does not imply that all of the features may be implemented in any single embodiment. The technical features and technical solutions described in the present embodiment may also be combined in any suitable manner.
It will be appreciated that, in order to implement the above-mentioned functions, the above-mentioned terminal includes corresponding hardware structures and/or software modules for executing the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The foregoing is merely a specific implementation of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the protection scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.
Claims (12)
1. A control positioning method applied to an electronic device, the method comprising:
the electronic equipment displays a first page of a first application;
the electronic equipment receives a first instruction, wherein the first instruction indicates to open a second page corresponding to a target control;
the electronic equipment determines a step for opening a second page corresponding to the target control;
and according to the step of opening the second page corresponding to the target control, the electronic equipment opens and displays the second page corresponding to the target control.
2. The method of claim 1, wherein the electronic device for opening and displaying a second page corresponding to the target control comprises:
the electronic device opens a third page and the second page successively, the third page including the target control.
3. The method of claim 2, wherein the method further comprises: the electronic device displays the third page after opening the third page and before opening the second page.
4. A method as claimed in any one of claims 1 to 3, wherein the first instruction comprises a first voice instruction;
The method further comprises the steps of: the electronic equipment converts the first voice instruction into a first text;
and the electronic equipment determines the target control according to the first text, wherein the first text comprises a first keyword, and the first keyword is matched with the target control.
5. The method of claim 4, wherein after the electronic device opens and displays a second page corresponding to the target control, the method further comprises:
and the electronic equipment plays first voice information, wherein the first voice information is used for indicating that the electronic equipment opens a second page corresponding to the target control.
6. The method of any of claims 1-5, wherein the step of the electronic device determining to open a second page corresponding to the target control comprises:
the electronic device determines the step of opening the second page corresponding to the target control according to a control page tree, wherein the control page tree comprises a control node, a page node and a relation between the control node and the page node, the control node corresponds to a control included in the first application, and the page node corresponds to a page included in the first application.
7. The method of claim 6, wherein the step of the electronic device determining to open a second page corresponding to the target control from a control page tree comprises:
the electronic equipment determines a path from a first page node to a target control node according to the control page tree, wherein the first page node is a node corresponding to the first page, and the target control node is a node corresponding to the target control;
and the electronic equipment determines the step of opening the second page corresponding to the target control according to the path from the first page node to the target control node.
8. The method of claim 7, wherein the electronic device determining a path from the first page node to the target control node from the control page tree comprises:
the electronic equipment acquires a first path from a home page node to the target control node and a second path from the home page node to the first page node, wherein the home page node is a node corresponding to a home page of the application;
and if the first path comprises the second path, determining a path from the first page node to the target control node through the child node of the first page node.
9. The method of claim 7, wherein the electronic device determining a path from the first page node to the target control node from the control page tree comprises:
the electronic equipment acquires a first path from a home page node to the target control node and a second path from the home page node to the first page node, wherein the home page node is a node corresponding to a home page of the application;
and if the first path does not comprise the second path, determining a path from the first page node to the target control node through a third page node, wherein the third page node is the bottommost node in the common path of the first path and the second path.
10. An electronic device, comprising: one or more processors, memory, touch screen, and one or more computer programs; the one or more processors, the memory, the touch screen, and the one or more computer programs are connected by one or more communication buses; the touch screen includes a touch surface and a display, the one or more computer programs being stored in the memory and configured to be executed by the one or more processors; the one or more computer programs include instructions for performing the control positioning method of any of claims 1-9.
11. A computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the control positioning method of any of claims 1-9.
12. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the control positioning method according to any of claims 1-9.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202111559609 | 2021-12-20 | | |
| CN2021115596099 | 2021-12-20 | | |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN116301510A | 2023-06-23 |
Family
ID=86800022
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202210010061.0A | Control positioning method and electronic equipment | 2021-12-20 | 2022-01-06 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN116301510A (en) |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |