CN114201738B - Unlocking method and electronic equipment - Google Patents

Unlocking method and electronic equipment

Info

Publication number
CN114201738B
CN114201738B (application CN202010911832.4A)
Authority
CN
China
Prior art keywords
electronic device
target equipment
user
screen
mobile phone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010911832.4A
Other languages
Chinese (zh)
Other versions
CN114201738A (en)
Inventor
杨诗姝
杨桐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202010911832.4A priority Critical patent/CN114201738B/en
Priority to PCT/CN2021/113610 priority patent/WO2022048453A1/en
Publication of CN114201738A publication Critical patent/CN114201738A/en
Application granted granted Critical
Publication of CN114201738B publication Critical patent/CN114201738B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Telephone Function (AREA)

Abstract

The application relates to the technical field of terminal devices and provides an unlocking method and an electronic device. The unlocking method includes the following steps: receiving a touch operation input by a user in a screen-off display interface, where the screen-off display interface includes one or more device identifiers, and the device identifiers are identifiers of peripheral devices that can be connected to the electronic device and/or peripheral devices that are already connected; if the touch operation is determined to act on a target device identifier, starting user identity authentication, where the target device identifier is any one of the one or more device identifiers; and if the user identity authentication passes, unlocking and enabling a shortcut corresponding to the target device identifier. The method and the device improve the efficiency of cross-device interaction.

Description

Unlocking method and electronic equipment
Technical Field
The application relates to the technical field of terminal equipment, in particular to an unlocking method and electronic equipment.
Background
With the rapid development of terminal technology, the terminal devices can realize more and more functions. For example, applications installed on terminal devices are increasing, and functions of the applications are also increasing.
In practical use, when a certain application or a certain function of the terminal device is required to be used, a user needs to unlock the terminal device first, then open the application, or open the application to enter a functional interface.
As can be seen, the user needs to perform complicated operations to use a certain application or function, which is inefficient.
Disclosure of Invention
The embodiment of the application provides an unlocking method and electronic equipment, which can solve at least one technical problem related to the prior art.
In a first aspect, an embodiment of the present application provides an unlocking method, which is applied to an electronic device, where the unlocking method includes:
receiving a touch operation input by a user in a screen-off display interface, where the screen-off display interface includes one or more device identifiers, and the device identifiers are identifiers of peripheral devices that can be connected to the electronic device and/or peripheral devices that are already connected;
if the touch operation is determined to act on the target equipment identifier, starting user identity authentication; the target device identifier is any one of the one or more device identifiers;
and if the user identity authentication is passed, unlocking and starting a shortcut corresponding to the target equipment identifier.
According to the embodiment of the first aspect of the application, the device identifiers of the peripheral devices are displayed in the screen-off display interface, so that the user can quickly unlock the electronic device and enable a shortcut path for the target device, improving the efficiency of cross-device interaction with the screen off.
As a possible implementation manner of the first aspect, the distribution of the one or more device identifiers in the screen-off display interface is mapped according to the spatial relationship between each peripheral device and the electronic device.
In this implementation, because the distribution of the device identifiers is mapped according to the spatial relationship between the peripheral devices and the electronic device, the user can quickly and accurately select the target device identifier and enable a shortcut for the target device, improving the efficiency and accuracy of cross-device interaction with the screen off.
As a possible implementation manner of the first aspect, the spatial relationship includes a spatial relationship of positioning and/or orientation.
As a possible implementation manner of the first aspect, in a case where the spatial relationship includes a spatial relationship of positioning and orientation, the spatial relationship includes the distance between each peripheral device and the electronic device, and the included angle between the line connecting each peripheral device to the electronic device and the orientation of the electronic device.
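As a rough illustration of how a distance and included angle could place a device identifier on the screen-off display interface (the projection scheme, function name, and parameter values here are hypothetical, not taken from the patent):

```python
import math

def map_peripheral_to_screen(distance_m, angle_deg, screen_w, screen_h,
                             max_distance_m=10.0):
    """Map a peripheral's (distance, included angle) relative to the phone
    onto screen coordinates for the screen-off display interface.
    angle_deg is the angle between the phone-to-peripheral line and the
    phone's orientation (0 = straight ahead, negative = left)."""
    # Normalize distance so that nearer devices sit closer to screen center.
    r = min(distance_m / max_distance_m, 1.0) * (screen_h / 2)
    theta = math.radians(angle_deg)
    x = screen_w / 2 + r * math.sin(theta)   # left/right offset from center
    y = screen_h / 2 - r * math.cos(theta)   # devices ahead are drawn higher
    return round(x), round(y)
```

For example, a device 5 m straight ahead of a 1080x2340 phone would be drawn centered horizontally and above the screen midpoint.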
As a possible implementation manner of the first aspect, the screen-off display interface further includes an orientation identifier of the electronic device.
In this implementation, the orientation of the electronic device is visualized in the screen-off display interface, so that the user can quickly map the distribution of device identifiers in the interface onto the actual scene, improving operation efficiency and accuracy.
As a possible implementation manner of the first aspect, enabling the shortcut corresponding to the target device identifier includes:
controlling the target device corresponding to the target device identifier to respond to a preset instruction; or
displaying a control panel interface of the target device corresponding to the target device identifier; or
calling up a display interface of the target device corresponding to the target device identifier; or
projecting a screen or audio to the target device corresponding to the target device identifier.
In this implementation, diversified shortcut paths are provided, so that the method and the device can be applied to different application scenarios and have stronger adaptability to the environment.
As a possible implementation manner of the first aspect, the touch operation includes a finger pressing operation, and the user authentication includes user authentication based on fingerprint identification.
In this implementation, on an electronic device with full-screen fingerprint identification, the user operation for enabling the shortcut is combined with user identification, reducing cumbersome operations and improving operation efficiency.
As a possible implementation manner of the first aspect, the one or more device identifiers satisfy a first condition, where the first condition includes an upper limit on the number of device identifiers, and/or that the deviation angle between the peripheral device corresponding to a device identifier and the electronic device is less than or equal to a maximum deviation angle.
In this implementation, the number of device identifiers displayed in the screen-off display interface is reduced, which avoids the problem that, when there are too many peripheral devices, overlapping device identifiers cause the user to select the wrong target device, thereby improving interaction accuracy.
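A minimal sketch of such a first condition (the field names, default limits, and the nearest-first preference are assumptions for illustration, not specified by the patent):

```python
def filter_device_identifiers(peripherals, max_count=4, max_deviation_deg=60.0):
    """Keep only peripherals whose deviation angle from the phone's
    orientation is within max_deviation_deg, then cap the result at
    max_count, preferring the nearest devices. Each peripheral is a dict
    with 'name', 'distance' (meters), and 'angle' (degrees, 0 = ahead)."""
    in_view = [p for p in peripherals if abs(p["angle"]) <= max_deviation_deg]
    in_view.sort(key=lambda p: p["distance"])  # nearest first
    return in_view[:max_count]
```

A device at an 80-degree deviation angle would thus be dropped from the interface even if it is connectable.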
In a second aspect, corresponding to the unlocking method provided in the first aspect, an unlocking device is provided and configured in an electronic device, where the unlocking device includes:
the device comprises a receiving module, a display module and a display module, wherein the receiving module is used for receiving touch operation input by a user in a screen-extinguishing display interface, the screen-extinguishing display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices which can be connected with electronic equipment and/or connected peripheral devices;
the authentication module is used for starting user identity authentication if the touch operation is determined to act on the target equipment identifier; the target device identifier is any one of the one or more device identifiers;
and the unlocking module is used for unlocking and starting the shortcut corresponding to the target equipment identifier if the user identity authentication passes.
As a possible implementation manner of the second aspect, the distribution of the one or more device identifiers in the screen-off display interface is mapped according to the spatial relationship between each peripheral device and the electronic device.
As a possible implementation manner of the second aspect, the spatial relationship includes a spatial relationship of positioning and/or orientation.
As a possible implementation manner of the second aspect, in a case where the spatial relationship includes a spatial relationship of positioning and orientation, the spatial relationship includes the distance between each peripheral device and the electronic device, and the included angle between the line connecting each peripheral device to the electronic device and the orientation of the electronic device.
As a possible implementation manner of the second aspect, the screen-off display interface further includes an orientation identifier of the electronic device.
As a possible implementation manner of the second aspect, the unlocking module is specifically configured to:
unlock and control the target device corresponding to the target device identifier to respond to a preset instruction; or
unlock and display a control panel interface of the target device corresponding to the target device identifier; or
unlock and call up a display interface of the target device corresponding to the target device identifier; or
unlock and project a screen or audio to the target device corresponding to the target device identifier.
As a possible implementation manner of the second aspect, the touch operation includes a finger pressing operation, and the user authentication includes user authentication based on fingerprint identification.
As a possible implementation manner of the second aspect, the one or more device identifiers satisfy a first condition, where the first condition includes an upper limit on the number of device identifiers, and/or that the deviation angle between the peripheral device corresponding to a device identifier and the electronic device is less than or equal to a maximum deviation angle.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to cause the electronic device to implement a method according to any one of the first aspect and the possible implementation manners of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which, when executed by a processor, implements a method as in any one of the first aspect and the possible implementation manners of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on an electronic device, causes the electronic device to perform the method according to any one of the above-mentioned first aspect and possible implementations of the first aspect.
It will be appreciated that the benefits of the second to fifth aspects described above may be seen from the relevant description of the first aspect described above.
Drawings
Fig. 1 is a schematic diagram of a screen-off display interface of a mobile phone according to an embodiment of the present application;
Fig. 2A is a schematic diagram of a positioning system according to an embodiment of the present application;
Fig. 2B is a schematic diagram of a positioning principle according to an embodiment of the present application;
Fig. 2C is a schematic diagram of an orientation principle according to an embodiment of the present application;
Fig. 2D is a schematic diagram of another orientation principle according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 4 is a software architecture diagram of an electronic device according to an embodiment of the present application;
Fig. 5 is a first application scenario provided in an embodiment of the present application;
Fig. 6A is a schematic diagram of a settings interface of a mobile phone according to an embodiment of the present application;
Fig. 6B is a schematic diagram of another settings interface of a mobile phone according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a settings interface of a mobile phone according to an embodiment of the present application;
Fig. 8 is a schematic diagram of another settings interface of a mobile phone according to an embodiment of the present application;
Fig. 9A is a schematic diagram of a screen-off display interface of a mobile phone according to an embodiment of the present application;
Fig. 9B is a schematic diagram of a screen-off display interface of a mobile phone according to an embodiment of the present application;
Fig. 10 is a third application scenario provided in an embodiment of the present application;
Fig. 11A is a schematic diagram of a screen-off display interface of a mobile phone according to an embodiment of the present application;
Fig. 11B is a schematic diagram of a mobile phone displaying a control panel interface of a television according to an embodiment of the present application;
Fig. 12A is a schematic diagram of a screen-off display interface of a mobile phone according to an embodiment of the present application;
Fig. 12B is a schematic diagram of a mobile phone calling up a display interface of a tablet computer according to an embodiment of the present application;
Fig. 13A is a schematic diagram of the positioning of the first mobile phone, the television, and the tablet computer in the third application scenario;
Fig. 13B is a schematic diagram of an included angle between the first mobile phone and the television in the third application scenario;
Fig. 13C is another schematic diagram of an included angle between the mobile phone and the television in the third application scenario;
Fig. 13D is a schematic diagram of the orientation of the mobile phone in the third application scenario;
Fig. 14 is a schematic view of the scene change in the third application scenario before and after the mobile phone rotates counterclockwise;
Fig. 15 is a schematic diagram of the change of the screen-off display interface of the mobile phone before and after the counterclockwise rotation in the third application scenario;
Fig. 16 is a schematic diagram of the change of another screen-off display interface of the mobile phone before and after the counterclockwise rotation in the third application scenario;
Fig. 17 is a flowchart of an unlocking method according to an embodiment of the present application;
Fig. 18 is a flowchart of an unlocking method according to another embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms, such as "one or more," unless the context clearly indicates otherwise.
It should also be understood that in embodiments of the present application, "a number" and "one or more" refer to one, two, or more than two; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship.
The term "comprises/comprising" when used in this specification and the appended claims is taken to specify the presence of stated features, integers, steps, operations, elements, and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when," "once," "in response to determining," or "in response to detecting."
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Currently, the number of applications installed on electronic devices keeps increasing, and when a user wants to open an application or use one of its functions, a series of relatively complex user operations is generally required to reach the interface of that application or function. The unlocking method and electronic device of the present application are therefore provided to reduce the complexity of user operations and improve operation efficiency.
For a better understanding of the technical solutions of the present application, several important technical terms related to the present application are introduced first.
Always-on display (Always On Display, AOD)
Always-on display refers to directly displaying content such as the time, temperature, date, calendar, incoming-call information, and push messages in a partial area of the screen without lighting up the entire screen of the electronic device.
As a non-limiting example, Fig. 1 shows an always-on display interface of a mobile phone. In the always-on display interface shown in Fig. 1, a partial area of the mobile phone screen is lit to display the time, date, and battery level.
In practical applications, when the screen of the electronic device is not lit in its entirety, the electronic device is usually in a screen-locked state.
Wireless positioning technology
Wireless positioning technology refers to the measurement methods, i.e., positioning algorithms, used to acquire the location of a mobile device in various wireless networks. Wireless positioning technologies include, but are not limited to, ultra-wideband (ultra wide band, UWB), wireless fidelity (wireless fidelity, Wi-Fi), Bluetooth (bluetooth, BT), and the like.
The most commonly used wireless positioning algorithms at present mainly include: angle-of-arrival (Angle of Arrival, AOA) based positioning algorithms, time-of-arrival (Time of Arrival, TOA) based positioning algorithms, time-difference-of-arrival (Time Difference of Arrival, TDOA) based positioning algorithms, and received-signal-strength (Received Signal Strength, RSS) based positioning algorithms. In some implementations, multiple positioning algorithms may be combined for positioning.
Due to the high time resolution of UWB signals, the TOA and TDOA positioning algorithms have higher accuracy relative to other positioning algorithms. At present, a more effective solution for UWB positioning is to adopt a TOA and TDOA hybrid positioning algorithm, and high positioning accuracy can be realized because the two positioning algorithms complement each other and combine the advantages of the two positioning algorithms.
For convenience of description, embodiments of the present application will be described by taking UWB positioning technology based on TOA positioning algorithm as an example. It should be understood that the UWB positioning technology is not to be construed as a specific limitation of the present application, and all positioning methods that can implement the technical solutions of the present application may be used in the present application.
In a TOA positioning algorithm, a base station transmits a specific ranging command or command signal to the electronic device and requires the electronic device to respond to it. The base station records the time elapsed from sending the ranging command to receiving the acknowledgement signal from the electronic device; this time mainly consists of the propagation delay of the radio-frequency signal (e.g., a UWB signal) on the loop, the response and processing delays of the electronic device, and the processing delay of the base station. If the response and processing delays of the electronic device and the base station can be accurately obtained, the loop propagation delay of the radio-frequency signal can be calculated. Since radio waves propagate through air at the speed of light, the distance between the base station and the electronic device can then be estimated. When three base stations participate in the measurement, the position of the electronic device can be determined by trilateration.
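The round-trip calculation described above can be sketched as follows (a simplified model that lumps all device-side response and processing delays into one known reply delay; the function name is illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def toa_distance(round_trip_s, reply_delay_s):
    """Estimate the base-station-to-device distance from a two-way TOA
    exchange. round_trip_s is the time from sending the ranging command
    to receiving the acknowledgement; reply_delay_s is the device's known
    response/processing delay. What remains is the signal's out-and-back
    flight time, half of which corresponds to the one-way distance."""
    one_way_flight_s = (round_trip_s - reply_delay_s) / 2.0
    return C * one_way_flight_s
```

For instance, a 700 ns round trip with a 500 ns reply delay leaves a 100 ns one-way flight, i.e. roughly 30 m.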
As a non-limiting example, as shown in Fig. 2A, a UWB positioning system includes three base stations carrying UWB modules (which may be referred to as UWB base stations) and a mobile phone carrying a UWB module (i.e., the tag to be located). To use UWB positioning technology, UWB base stations need to be installed in the environment in advance. In this example, three UWB base stations are installed, namely a first UWB base station 21, a second UWB base station 22, and a third UWB base station 23. The mobile phone 20 in the environment is located by relying on the three UWB base stations.
As illustrated in Fig. 2B, the position coordinates (x0, y0) of the mobile phone 20 are obtained from the first distance r1 from the first UWB base station 21 to the mobile phone 20, the second distance r2 from the second UWB base station 22 to the mobile phone 20, and the third distance r3 from the third UWB base station 23 to the mobile phone 20, combined with the known position coordinates (x1, y1) of the first UWB base station 21, (x2, y2) of the second UWB base station 22, and (x3, y3) of the third UWB base station 23.
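Given the three known base-station positions and three measured distances of Fig. 2B, (x0, y0) can be recovered by subtracting the circle equations pairwise, which yields a small linear system. The following sketch is an illustration of that standard technique, not the patent's implementation:

```python
def trilaterate(b1, r1, b2, r2, b3, r3):
    """Solve for (x0, y0) from three base-station positions b = (x, y)
    and measured distances r. Subtracting the circle equations
    (x0 - xi)^2 + (y0 - yi)^2 = ri^2 pairwise cancels the quadratic
    terms, leaving a 2x2 linear system in x0 and y0."""
    (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the base stations are collinear
    x0 = (c1 * a22 - c2 * a12) / det
    y0 = (a11 * c2 - a21 * c1) / det
    return x0, y0
```

Note that the three base stations must not be collinear, otherwise the system is degenerate; in practice noisy distances are usually handled with a least-squares variant.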
UWB orientation may be achieved in two ways.
The first is the AOA measurement method, in which the UWB device to be oriented uses multiple antennas while the UWB base station (or the device the UWB base station depends on) uses only one antenna. As shown in Fig. 2C, the UWB base station transmits special data packets through a single antenna using a low-power transmitter. A nearby low-power receiver, such as the receiver of a mobile phone, is equipped with multiple antennas arranged in an array. Because the distances from the transmitter to the receiver's antennas differ, a phase difference appears between the signals received at the antennas, from which the relative direction of the signal, for example the angle θ of the mobile phone relative to the UWB base station, can finally be calculated.
The second is the angle-of-departure (Angle of Departure, AOD) measurement method, in which the UWB device to be oriented uses only one antenna while the UWB base station (or the device the UWB base station depends on) uses multiple antennas. As shown in Fig. 2D, the mobile phone receives signals through a low-power receiver, and the UWB base station transmits special data packets through a low-power transmitter while switching between the active antennas arranged in an array. The receiver of the mobile phone acquires IQ samples from the received signal and, knowing the antenna arrangement in the transmitter, finally calculates the relative direction of the signal, for example the angles θ1 and θ2 of the mobile phone relative to two UWB base stations.
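For the AOA case, the phase difference Δφ measured between two antennas spaced a distance d apart relates to the arrival angle θ by Δφ = 2πd·sin(θ)/λ. A minimal sketch of that relation (the two-antenna simplification and the function name are assumptions for illustration):

```python
import math

def aoa_from_phase(phase_diff_rad, antenna_spacing_m, wavelength_m):
    """Estimate the angle of arrival (degrees) from the phase difference
    between two receive antennas, inverting
    delta_phi = 2*pi*d*sin(theta)/lambda."""
    s = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.degrees(math.asin(s))
```

With half-wavelength spacing (d = λ/2), a measured phase difference of π/2 corresponds to a 30-degree arrival angle.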
Through UWB orientation technology, the angle of the mobile phone relative to UWB-equipped electronic devices such as televisions, smart speakers, and tablet computers can be calculated.
In order to illustrate the technical solution of the present application, the following description is made by specific examples.
The unlocking method provided by the embodiments of the present application can be applied to an electronic device, where the electronic device includes, but is not limited to, a mobile phone with a touch display screen, a wearable device, a vehicle-mounted device, an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a tablet computer, a smart speaker, a television, and the like. The embodiments of the present application do not limit the specific type of the electronic device.
In some embodiments of the present application, the electronic device may include a portable, handheld, or mobile electronic device, such as a cell phone, tablet, wearable device, or portable game console, among others.
Fig. 3 is a schematic diagram of the electronic device 100, taking a mobile phone as an example.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instruction or data again, it can be called directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wi-Fi network), BT, global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared (IR), UWB communication technology, etc., applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, UWB, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin color of the image. The ISP can also optimize parameters such as exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece," is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or a voice message, voice may be received by placing the receiver 170B close to the human ear.
Microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, and may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
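The thresholded dispatch in the short-message example above can be sketched as follows; the numeric threshold and the action names are illustrative placeholders, not values from the embodiment.

```python
def sms_icon_action(touch_intensity, first_pressure_threshold=0.5):
    """Map the detected touch intensity on the short-message application
    icon to an operation instruction: below the first pressure threshold,
    view the message; at or above it, create a new message.
    The threshold 0.5 is an illustrative placeholder value."""
    if touch_intensity < first_pressure_threshold:
        return "view_sms"
    return "new_sms"

print(sms_icon_action(0.2))  # view_sms
print(sms_icon_action(0.8))  # new_sms
```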
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory game scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon opening the flip can then be set according to the detected open or closed state of the leather case or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait screen switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear, and then automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
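The reflected-light decision described above can be sketched as a simple threshold test; the threshold value, its units, and the in-call condition are illustrative assumptions.

```python
def should_turn_off_screen(in_call, reflected_light, threshold=100):
    """Turn off the screen when a call is active and the proximity sensor
    detects sufficient reflected infrared light, i.e. the device is held
    close to the ear. The threshold (arbitrary ADC counts) is an
    illustrative assumption."""
    object_nearby = reflected_light >= threshold
    return in_call and object_nearby

print(should_turn_off_screen(True, 250))   # True  (on a call, held to ear)
print(should_turn_off_screen(True, 20))    # False (on a call, no object near)
print(should_turn_off_screen(False, 250))  # False (not on a call)
```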
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
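The three-threshold temperature processing strategy described above can be sketched as follows; the numeric thresholds and the action names are illustrative assumptions, not values from the embodiment.

```python
def thermal_policy(temp_c, hot=45.0, cold=0.0, very_cold=-10.0):
    """Select temperature processing actions from the detected temperature:
    throttle a nearby processor when too hot, heat the battery when too
    cold, and additionally boost the battery output voltage when even
    colder. All threshold values are illustrative placeholders."""
    actions = []
    if temp_c > hot:
        actions.append("throttle_nearby_processor")
    if temp_c < cold:
        actions.append("heat_battery")
    if temp_c < very_cold:
        actions.append("boost_battery_output_voltage")
    return actions

print(thermal_policy(50))   # ['throttle_nearby_processor']
print(thermal_policy(-5))   # ['heat_battery']
print(thermal_policy(-20))  # ['heat_battery', 'boost_battery_output_voltage']
```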
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human voice part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be provided in an earphone, combined into a bone conduction earphone. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the voice part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to achieve contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the invention, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 4 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android Runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 4, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 4, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can acquire the display screen size, determine whether there is a status bar, lock the screen, capture screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example, management of call states (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a brief stay without requiring user interaction, for example, to notify that a download is complete or to give a message alert. The notification manager may also present notifications in the form of charts or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android Runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the function interfaces that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following illustrates the workflow of the software and hardware of the electronic device 100 with an example.
When the touch sensor 180K of the electronic device 100 receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event.
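The event flow described above — kernel layer wrapping a touch into a raw input event, framework layer mapping its coordinates to a control — can be sketched as follows. This is an illustrative simulation only; the class names, fields, and `dispatch` helper are hypothetical and are not the Android framework's actual API.

```python
# Hypothetical sketch of the input-event flow: the kernel layer produces a
# raw event with coordinates and a timestamp, and the framework layer finds
# the control whose bounds contain the touch point.
import time
from dataclasses import dataclass

@dataclass
class RawInputEvent:
    x: int            # touch x coordinate
    y: int            # touch y coordinate
    timestamp: float  # time of the touch operation

class Control:
    def __init__(self, name, left, top, right, bottom):
        self.name = name
        self.bounds = (left, top, right, bottom)

    def contains(self, x, y):
        l, t, r, b = self.bounds
        return l <= x < r and t <= y < b

def dispatch(event, controls):
    """Framework layer: identify the control corresponding to the input event."""
    for control in controls:
        if control.contains(event.x, event.y):
            return control.name
    return None

controls = [Control("camera_icon", 0, 0, 100, 100),
            Control("gallery_icon", 100, 0, 200, 100)]
event = RawInputEvent(x=150, y=40, timestamp=time.time())
print(dispatch(event, controls))  # the touch lands inside the gallery icon's bounds
```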
The application scenarios and implementation procedures of the embodiments of this application are illustrated by the following non-limiting examples. For convenience of description, a mobile phone is used as the example of the electronic device in the following application scenarios. It should be noted that the exemplary description of a given application scenario is not limited to that scenario; various modifications, combinations, substitutions, or alterations of the examples may be made in light of the detailed descriptions of the various application scenarios without departing from this application.
First application scenario
The first application scenario is one in which the mobile phone is unlocked from the off-screen display and a specific function or a specific application of the mobile phone is entered directly.
At present, under-display fingerprint technology is mature, major terminal manufacturers have begun to invest in full-screen fingerprint technology, and terminal products supporting full-screen fingerprint unlocking have been developed. In the first application scenario, the mobile phone performs user identity authentication through under-display fingerprint recognition; the mobile phone is one that supports full-screen fingerprint recognition.
As shown in diagram A of fig. 5, the off-screen display interface of the mobile phone displays 6 blocks in addition to the time, date, and battery level. One of the 6 blocks is a first block 51 and another is a second block 52.
When the user prepares to pay with the mobile phone at a convenience store, the mobile phone receives a pressing operation by the user's finger, e.g., the left or right thumb, on the first block 51 shown in diagram A of fig. 5. After the fingerprint identification passes, i.e., the user identity authentication succeeds, the mobile phone is unlocked and enters directly from the off-screen display interface into a payment code interface provided by an application such as Alipay™, WeChat™, or a financial client. The payment code interface may be as shown in diagram B of fig. 5, where the payment code is a two-dimensional code.
After payment is completed and a period of time passes, the mobile phone automatically locks the screen and shows the off-screen display interface. Suppose the user wants to quickly check chat messages of an instant messaging application such as WeChat™. The mobile phone receives a pressing operation by the user's finger, e.g., the left or right thumb, on the second block 52 shown in diagram A of fig. 5, and when the fingerprint identification passes, i.e., the user identity authentication succeeds, the mobile phone can enter the chat message list display interface directly from the off-screen display interface. The display interface of the chat message list may be as shown in diagram C of fig. 5.
In this process, the mobile phone determines from a detected user operation that the user wants to unlock the phone, for example, the user picks up or shakes the phone, presses a mechanical key such as the power key or a volume key, or touches or taps the screen; the mobile phone then presents the off-screen display interface shown in diagram A of fig. 5. The user presses a fingerprint on the first block 51 to unlock the phone; the mobile phone collects the user's fingerprint according to the pressing operation on the first block 51 and authenticates it, and after the fingerprint passes authentication, the mobile phone lights up the screen and displays the payment code interface. After the user finishes paying by having the payment code scanned, the mobile phone may automatically enter the locked state, or enter the locked state according to a received lock-screen operation input by the user (such as pressing the power key). When the mobile phone enters the locked state, the screen is turned off. After the screen has been off for a period of time, the user wants to view chat messages; the mobile phone determines from a detected user operation or action, such as the user's finger touching the screen, that the user wants to unlock the phone, and presents the off-screen display interface. The user presses a fingerprint on the second block 52 of the off-screen display interface to unlock the phone; the mobile phone lights up the screen and presents the display interface of the chat message list.
In the off-screen display interface shown in diagram A of fig. 5, 6 blocks are displayed. Apart from the first block 51 and the second block 52, each of the remaining 4 blocks may also correspond to a specific function or application interface. The mobile phone performs fingerprint identification according to a received pressing operation by the user on a block, and when the fingerprint identification passes, i.e., the user identity authentication succeeds, the mobile phone enters the function or application interface corresponding to that block.
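The shortcut-unlock flow described above can be sketched as follows: a press on one of the off-screen-display blocks triggers fingerprint authentication, and on success the phone opens the target bound to that block. All block identifiers, target names, and the matcher below are hypothetical stand-ins, not the patent's implementation.

```python
# Illustrative sketch: block -> function/application binding with a
# fingerprint check gating the jump from the off-screen display.
BLOCK_TARGETS = {  # hypothetical block id -> target interface
    "block_1": "payment_code_interface",
    "block_2": "chat_message_list",
}

ENROLLED_FINGERPRINT = "user_fp_template"  # stand-in for the enrolled template

def authenticate(fingerprint):
    # Stand-in for the under-display fingerprint matcher.
    return fingerprint == ENROLLED_FINGERPRINT

def handle_press(block_id, fingerprint):
    if not authenticate(fingerprint):
        return "locked"  # authentication failed: stay on the off-screen display
    # Authentication succeeded: open the bound target, or the home screen
    # if the pressed block has no binding.
    return BLOCK_TARGETS.get(block_id, "home_screen")

print(handle_press("block_1", "user_fp_template"))  # opens the payment code interface
print(handle_press("block_2", "someone_else"))      # stays locked
```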
In the example shown in fig. 5, a two-dimensional code is used as the payment code in the payment code interface. In other examples, the payment code interface may also employ a bar code, or a combination of a bar code and a two-dimensional code, or the like, as the payment code. The specific presentation form of the payment code is not particularly limited in the embodiments of the present application.
According to the embodiments of this application, unlocking areas are demarcated on a terminal screen that supports full-screen fingerprint recognition, combined with the visual feedback of the off-screen display, and each area serves as a shortcut into a specific function or application. The user can quickly enter a specific function or application through fingerprint unlocking, which improves the user's operating efficiency.
It should be noted that the number of blocks included in the off-screen display interface and/or the function or application corresponding to each block may be set by default by the system or may be user-defined.
In some possible implementations, the electronic device may provide a setting interface for the shortcuts, in which the user can modify or edit the number of blocks in the off-screen display interface and/or the specific function or application corresponding to each block; this is not limited in this application.
As a non-limiting example, as shown in fig. 6A and 6B, the shortcut setting interface 61 of the mobile phone provides 6 selectable shortcut paths, one for each bar control. For example, the upper left block corresponds to bar control 611 and the lower left block corresponds to bar control 612.
As shown in fig. 6A, the mobile phone receives the user's click operation on the bar control 611, determines that the block to be set this time is the upper-left block, and presents the setting interface 62 for the upper-left block. The upper-left block setting interface 62 may include a block layout display area 621, a system recommendation area 622, and a user-defined area 623. The block layout display area 621 is used to show the position of the upper-left block in the overall layout and to visually mark the block currently being set, so that the user can easily distinguish the different blocks. In the block layout display area 621 shown in fig. 6A, the upper-left block is displayed in a pattern different from the other 5 blocks to illustrate the distinction. The system recommendation area 622 is used to display the function or application recommended by the system. In the system recommendation area 622 shown in fig. 6A, a ride-code function of a certain application recommended by the system is displayed; the user may click the switch control 6221 in the system recommendation area 622, and the mobile phone sets the upper-left block to correspond to the system-recommended function or application, i.e., the ride-code function, according to the user's click operation that turns on the switch control 6221. The user-defined area 623 is used for user-defined setting of the function or application corresponding to the upper-left block. The user clicks the drop-down menu control 6231 in the user-defined area 623 and may select a function or application in the drop-down menu list as the one corresponding to the upper-left block.
As shown in fig. 6B, the mobile phone receives the user's click operation on the bar control 612, determines that the block to be set this time is the lower-left block, and presents the setting interface 63 for the lower-left block. The lower-left block setting interface 63 may include a block layout display area 631, a system recommendation area 632, and a user-defined area 633. The block layout display area 631 is used to show the position of the lower-left block in the overall layout and to visually mark the block currently being set, so that the user can easily distinguish the different blocks. In the block layout display area 631 shown in fig. 6B, the lower-left block is displayed in a pattern different from the other 5 blocks to illustrate the distinction. The system recommendation area 632 is used to display the function or application recommended by the system. In the system recommendation area 632 shown in fig. 6B, the payment code function of a certain application recommended by the system is displayed; the user may click the switch control 6321 in the system recommendation area 632, and the mobile phone sets the lower-left block to correspond to the system-recommended function or application, i.e., the payment code function, according to the user's click operation that turns on the switch control 6321. The user-defined area 633 is used for user-defined setting of the function or application corresponding to the lower-left block. The user clicks the drop-down menu control 6331 in the user-defined area 633 and may select a function or application in the drop-down menu list as the one corresponding to the lower-left block.
As shown in fig. 6A and 6B, the shortcut setting interface 61 includes 6 selectable shortcuts, of which two blocks have been given corresponding shortcuts: the lower-left block is set to correspond to the payment code function, and the middle-right block is set to correspond to the chat message list of an application. Based on this setting, the user can enter the function or application corresponding to each of these blocks from the off-screen display interface by triggering the lower-left block or the middle-right block; for an example, see fig. 5. It should be noted that, in other examples, when two blocks have corresponding shortcuts, only those two blocks may be displayed in the off-screen display interface, and the other blocks without a corresponding function or application may not be displayed.
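A minimal sketch of how the bindings from the setting interface might be resolved into the set of blocks shown on the off-screen display. It assumes (this precedence is an illustrative assumption, not stated in the text) that a user-defined choice overrides the system recommendation, and that blocks with no binding are simply not displayed; all block and target names are hypothetical.

```python
# Hypothetical resolution of shortcut bindings: user-defined choices win
# over system recommendations, and unbound blocks are not shown on the AOD.
def resolve_blocks(system_recommended, user_defined):
    resolved = {}
    for block in set(system_recommended) | set(user_defined):
        # Assumption: user-defined binding takes precedence if present.
        target = user_defined.get(block) or system_recommended.get(block)
        if target:  # blocks with neither binding are omitted
            resolved[block] = target
    return resolved

shown = resolve_blocks(
    system_recommended={"lower_left": "payment_code"},
    user_defined={"middle_right": "chat_list", "lower_left": "transit_code"},
)
print(shown["lower_left"])  # the user-defined choice wins over the recommendation
print(sorted(shown))        # only bound blocks are displayed
```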
As another non-limiting example, as shown in fig. 7, the shortcut setting interface 71 of the mobile phone provides a plurality of selectable block layout forms, and different block layout forms can be switched by user operation. The layout form of 4 blocks is displayed in the layout display area 711. The user can click on the right control 7111 in the layout display area 711 or input a right-slide touch operation, and the mobile phone switches to the layout display area 712 in the form of a layout including 5 blocks according to the received click operation of the right control 7111 by the user or the right-slide touch operation input by the user. The user can click on the left control 7112 in the layout display area 711 or input a left swipe touch operation, and the mobile phone switches to the layout display area 713 in the form of a layout including 3 blocks according to the received click operation of the left control 7112 by the user or the left swipe touch operation input by the user.
Based on the example shown in fig. 7, in some implementations the user can gradually increase the number of blocks displayed in the layout display area by repeatedly inputting a sliding touch operation in a first direction, e.g., to the right, and can gradually decrease that number by repeatedly inputting a sliding touch operation in the opposite direction, e.g., to the left. Alternatively, in other implementations, the mapping may be reversed: repeated sliding touch operations in the first direction, e.g., to the right, gradually decrease the number of blocks displayed, and repeated sliding touch operations in the opposite direction, e.g., to the left, gradually increase it.
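The swipe-to-switch-layout behaviour can be sketched as a small state machine stepping through the selectable block counts. The exact set of layouts (3 to 6 blocks here) is an illustrative assumption; fig. 7 shows layouts of 3, 4, and 5 blocks.

```python
# Hypothetical layout switcher: each swipe steps to the adjacent layout,
# clamped at the smallest and largest selectable layouts.
LAYOUTS = [3, 4, 5, 6]  # assumed selectable block counts

def switch_layout(current, direction):
    i = LAYOUTS.index(current)
    if direction == "right":          # e.g. right-swipe: next larger layout
        i = min(i + 1, len(LAYOUTS) - 1)
    elif direction == "left":         # e.g. left-swipe: next smaller layout
        i = max(i - 1, 0)
    return LAYOUTS[i]

print(switch_layout(4, "right"))  # 4-block layout -> 5-block layout
print(switch_layout(4, "left"))   # 4-block layout -> 3-block layout
print(switch_layout(3, "left"))   # already at the smallest layout, stays at 3
```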
On the basis of the example shown in fig. 7, as shown in fig. 8, the layout display area 711 in the shortcut path setting interface 71 displays the layout form of 4 blocks, and the user can change the display position of any block by long-pressing the block, that is, change the layout of the block. For example, as shown in fig. 8, the mobile phone receives a long press movement touch operation by the user on the block 7113, and changes the position of the block 7113 to a target position, that is, a lift (up) position of the long press movement touch operation.
On the basis of the examples shown in fig. 7 and 8, the layout display area 711 in the shortcut setting interface 71 displays the layout form of 4 blocks for which the user can perform setting of functions or applications by long pressing any one block. That is, the mobile phone can enter the setting interface of any block according to the long-press operation of the block input by the user, and set the function or application corresponding to the block in the setting interface of the block. The setup interface for the tiles may be similar to the setup interface 62 for the upper left tile shown in fig. 6A or the setup interface 63 for the lower left tile shown in fig. 6B.
Based on the examples shown in fig. 7 and 8, the user may customize the number and/or layout of the blocks displayed in the off-screen display interface.
It should be understood that the user interfaces shown in fig. 6A, 6B, 7, and 8 are merely exemplary descriptions. In actual use, the user interface may include more or fewer interface elements than in fig. 6A, 6B, 7, and 8, and the interface layout may be different.
In the example shown in fig. 5, a rectangle is used as the identifier of each shortcut. In other examples, icons, patterns, text, etc. corresponding to the functions or applications may also be used as the shortcut identifiers. For example, as shown in fig. 9A and 9B, icons or patterns corresponding to functions or applications are used as the shortcut identifiers, which makes it easy for the user to distinguish each shortcut, lowers the user's operating threshold, and improves operating accuracy. In the examples shown in fig. 9A and 9B, icon 91 in the off-screen display interface corresponds to a calculator application, icon 92 corresponds to a music player application, icon 93 corresponds to a mail application, icon 94 corresponds to WeChat™, and pattern 95 corresponds to a two-dimensional-code payment function in an application. By pressing the screen area where icon 91 is displayed, the user can enter the calculator application interface after unlocking the handset; by pressing the area of icon 92, the music player application; by pressing the area of icon 93, the mail application; by pressing the area of icon 94, the WeChat™ application; and by pressing the area where pattern 95 is displayed, the two-dimensional-code payment interface of an application.
In the examples shown in fig. 9A and 9B, the shortcut identifiers in the off-screen display interface are displayed in different areas of the screen. The display area of a shortcut identifier can be determined by system settings or user-defined settings. In some embodiments, because the mobile phone screen is large, to facilitate user operation and improve operating efficiency, the off-screen display content may be shown in the lower area of the screen as in fig. 9B, according to a system default or a user-defined setting. In other embodiments, to further facilitate user operation on a large screen and thereby further improve operating efficiency, the shortcut identifiers may be displayed in the lower-left or lower-right area of the screen according to the detected left-hand or right-hand holding state. In still other embodiments, the shortcut identifiers in the off-screen display interface may change their display area, and likewise other content in the off-screen display interface may change its display area; for example, the display position of the displayed content changes once every preset time period, or the display position of the content differs from its previous display position. It should be understood that this is only an exemplary description and is not to be construed as a specific limitation on this application.
In other possible implementations, the mobile phone determines the function or application corresponding to each block according to one or more of the user usage habits of the installed application or function, consumed traffic, and scene information. The usage habits of the user may include usage rules, usage duration and/or usage frequency, among others. The scene information includes position information and/or motion state information, etc.
As a non-limiting example, the handset may rank applications and/or functions according to their usage duration, number of uses, amount of traffic consumed, or the like, over a period of time. Compared with lower-ranked applications and/or functions, the top-ranked ones have a greater probability of being what the user intends to use after unlocking.
For example, suppose the mobile phone defaults to 6 shortcuts. The mobile phone analyzes the user's historical usage data to obtain the usage ranking of applications and/or functions over the last week, with the top 6 being, in order: WeChat™, Weibo, news, music, phone, and the payment code of Alipay™. The mobile phone therefore sets these 6 applications and functions as shortcuts and displays their respective identifiers on the off-screen display interface for the user to select, so that the user can quickly enter these applications and/or functions.
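The usage-based ranking described above can be sketched in a few lines: tally usage over the analysed period, sort descending, and take the top N as shortcut candidates. The usage figures below are made up for illustration.

```python
# Illustrative ranking of apps/functions by recent usage duration; the
# top-N entries become the shortcut candidates shown on the AOD.
from collections import Counter

def top_shortcuts(usage_minutes, n=6):
    """Return the n most-used apps/functions over the analysed period."""
    return [app for app, _ in Counter(usage_minutes).most_common(n)]

last_week = {  # hypothetical minutes of use over the last week
    "WeChat": 620, "Weibo": 410, "News": 300, "Music": 280,
    "Phone": 150, "Alipay payment code": 90, "Calculator": 12,
}
print(top_shortcuts(last_week, n=6))  # the calculator does not make the cut
```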
As a non-limiting example, the handset may compile daily usage patterns of applications and/or functions from their daily usage records over a period of time. The periods in which the mobile phone is used intensively within a day may include N periods, for example 5 periods. For each period, the usage duration of each application and/or function is recorded, and the applications and/or functions are ranked by usage duration. For each period, the top-ranked applications and/or functions are, compared with lower-ranked ones, more likely to be what the user wants to use after unlocking; therefore the top M (M is a positive integer) applications and/or functions are set as the shortcuts for that period.
For example, the mobile phone analyzes historical usage data of the user, obtains 4 time periods of centralized usage of the mobile phone every day, for example, 8:00 to 10:00,12:00 to 13:30,18:00 to 20:00, and 20:30 to 23:00, respectively counts the usage rate of each application and/or function in the 4 time periods, ranks according to the usage rate, sets M (M is a positive integer) applications and/or functions ranked at the top as shortcut paths by the mobile phone, displays respective identifications of the shortcut paths on a screen display interface for the user to select, and facilitates the user to enter the applications and/or functions quickly. It should be noted that the number of shortcut paths corresponding to the 4 time periods may be the same or different, which is not limited in the present application.
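The per-period variant above can be sketched the same way: bucket usage by the periods of intensive use, rank within each bucket, and keep the top M per period. The periods and minutes below are illustrative only.

```python
# Illustrative per-time-period shortcut selection: rank apps/functions
# within each period of intensive use and keep the top M for that period.
def shortcuts_by_period(usage, m=3):
    result = {}
    for period, app_minutes in usage.items():
        ranked = sorted(app_minutes, key=app_minutes.get, reverse=True)
        result[period] = ranked[:m]  # that period's shortcuts
    return result

usage = {  # hypothetical usage minutes per period
    "08:00-10:00": {"News": 40, "Music": 25, "Mail": 10, "WeChat": 8},
    "20:30-23:00": {"Video": 60, "WeChat": 30, "Reading": 20, "News": 5},
}
print(shortcuts_by_period(usage, m=3))  # different shortcuts morning vs evening
```

Note that, as the text says, M need not be the same for every period; the sketch uses a single `m` only for brevity.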
As another non-limiting example, the mobile phone may determine the user's current location through positioning, and determine from the current location the environment the user is in, such as a mall, airport, home, subway, or movie theater. The mobile phone then displays the shortcuts associated with that scene on the off-screen display interface.
For example, the mobile phone locates its current position, determines from it that the user is in a mall, and sets shopping applications, payment functions, and the like as shortcuts. Icons or patterns corresponding to the shopping applications, payment functions, and the like are displayed on the off-screen display interface for the user to select, so that the user can quickly enter these applications and/or functions.
For another example, the mobile phone locates its current position, determines from it that the user is at home, and sets applications or functions for controlling home devices, reading applications, music playing applications, and the like as shortcuts. Icons or patterns corresponding to these applications or functions are displayed on the off-screen display interface for the user to select, so that the user can quickly enter them.
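The scene-based selection in these examples amounts to a mapping from the located scene to a shortcut set. The scene labels and bindings below are hypothetical illustrations of that mapping.

```python
# Illustrative scene -> shortcut-set mapping; an unrecognized scene falls
# back to showing no scene-specific shortcuts.
SCENE_SHORTCUTS = {  # hypothetical bindings
    "mall":   ["shopping_app", "payment_code"],
    "home":   ["home_device_control", "reading_app", "music_player"],
    "subway": ["transit_code", "music_player"],
}

def shortcuts_for_scene(scene):
    return SCENE_SHORTCUTS.get(scene, [])

print(shortcuts_for_scene("home"))    # home-control, reading, music shortcuts
print(shortcuts_for_scene("office"))  # unknown scene: no scene shortcuts
```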
As another non-limiting example, the mobile phone may obtain its motion state information from positioning, an acceleration sensor, and/or a gyroscope sensor, etc., determine from this information the user's motion state, such as running, walking, or riding, and display the shortcuts associated with that motion state on the off-screen display interface.
For example, the mobile phone determines that the current movement speed is low, determines that the user is in a jogging state, and sets a fitness management application, a music playing application, and the like as shortcuts. Icons or patterns corresponding to the fitness management application, the music playing application or music control function, and the like, are displayed on the off-screen display interface for the user to select, so that the user can quickly enter these applications and/or functions.
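The motion-state branch can be sketched as a speed-to-state classifier feeding the same kind of shortcut mapping. The speed thresholds and shortcut names are illustrative assumptions, not values from the patent.

```python
# Illustrative classifier: derive a coarse motion state from measured speed
# (m/s) and pick the shortcut set associated with that state.
def motion_state(speed_mps):
    if speed_mps < 0.5:   # assumed threshold for "not moving"
        return "stationary"
    if speed_mps < 2.5:   # assumed threshold separating foot travel from riding
        return "walking_or_jogging"
    return "riding"

STATE_SHORTCUTS = {  # hypothetical bindings
    "walking_or_jogging": ["fitness_app", "music_control"],
    "riding":             ["navigation", "music_control"],
    "stationary":         [],
}

def shortcuts_for_motion(speed_mps):
    return STATE_SHORTCUTS[motion_state(speed_mps)]

print(shortcuts_for_motion(1.8))  # jogging speed -> fitness and music shortcuts
```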
Second application scenario
The second application scenario is likewise one in which the mobile phone is unlocked from the off-screen display and a specific function or a specific application of the mobile phone is entered directly.
At present, face recognition technology is mature and secure, and terminal products that unlock through face recognition already exist. In the second application scenario, the mobile phone performs user identity authentication by starting the camera for face recognition; the mobile phone is one that supports face recognition.
With continued reference to diagram A of fig. 5, the off-screen display interface of the mobile phone displays 6 blocks in addition to the time, date, and battery level. One of the 6 blocks is a first block 51 and another is a second block 52.
When the user prepares to pay with the mobile phone at a convenience store, the mobile phone detects a touch operation by the user's finger or a stylus on the first block 51 shown in diagram A of fig. 5, and starts the front-facing camera 53 to capture an image of the user's face. After the face recognition passes, i.e., the user identity authentication succeeds, the mobile phone is unlocked and enters directly from the off-screen display interface into a payment code interface provided by an application such as Alipay™, WeChat™, or a financial client. The payment code interface may be as shown in diagram B of fig. 5, where the payment code is a two-dimensional code.
After payment is completed and a period of time passes, the mobile phone locks the screen and shows the off-screen display interface. Suppose the user wants to quickly check chat messages of an instant messaging application such as WeChat™. The mobile phone detects a touch operation by the user's finger or a stylus on the second block 52 shown in diagram A of fig. 5, and starts the front-facing camera 53 to capture an image of the user's face. When the face recognition passes, i.e., the user identity authentication succeeds, the mobile phone is unlocked and can enter the display interface of the chat message list directly from the off-screen display interface. The display interface of the chat message list may be as shown in diagram C of fig. 5.
It should be noted that the second application scenario adopts a user identity authentication mode different from that of the first application scenario; the remaining processes are the same as those of the first application scenario and are not repeated here.
Third application scenario
The third application scenario is a cross-device interaction application scenario. In the third application scenario, the user sits on a sofa at home holding a mobile phone, with his own tablet computer on the left, the television ahead on the right, the smart speaker on the right, and a standby mobile phone on the left side of the sofa. The third application scenario thus includes the first mobile phone 1010 held by the user, the television 1020, the smart speaker 1030, the tablet computer 1040, and the second mobile phone 1050 (i.e., the standby mobile phone).
In the scenario shown in fig. 10, the first mobile phone 1010 is taken as the search device and the other devices are taken as connectable devices. The first mobile phone 1010 may establish a communication connection with one or more of the other devices, which include the television 1020, the smart speaker 1030, the tablet computer 1040, and the second mobile phone 1050. It should be appreciated that in other application scenarios the second mobile phone 1050 may act as the search device and the first mobile phone 1010 as a connectable device. The roles of the devices may be interchanged according to the actual situation of the application scenario, which is not limited in this application; the above is merely an illustrative example.
In the scenario illustrated in fig. 10, the first mobile phone 1010 is the search device that searches for surrounding connectable devices, and the first mobile phone 1010 may wirelessly communicate with any one or more of the searched connectable devices using wireless communication technologies supported by the devices. Wireless communication technologies supported by the devices include, but are not limited to, Wi-Fi, BT, IR, GPS, High Performance Radio LAN (HiperLAN), Radio Frequency (RF), Wireless USB (WUSB), UWB, or the like. It should be appreciated that in practical applications other wireless communication technologies, or alternatively wired communication technologies, may be employed. In the description of the third application scenario, the first mobile phone 1010 performs wireless communication with the peripheral devices, that is, the first mobile phone 1010 may establish wireless communication connections with the television 1020, the smart speaker 1030, the tablet computer 1040, and the second mobile phone 1050.
After the first mobile phone 1010 establishes wireless communication connections with the television 1020, the smart speaker 1030, the tablet computer 1040 and the second mobile phone 1050, data interaction between the first mobile phone 1010 and each of these devices can be achieved, and thereby the first mobile phone 1010 can control the television 1020, the smart speaker 1030, the tablet computer 1040, the second mobile phone 1050, and the like.
In a first implementation, the off-screen display interface of the first mobile phone 1010 may display device identifiers of at least some of the connectable devices. Each device identifier serves as a shortcut identifier. The connectable devices are the connectable peripheral devices searched by the first mobile phone 1010. The user may select a connectable device, referred to as the target device, in the off-screen display interface of the first mobile phone 1010. The first mobile phone 1010 starts fingerprint recognition or face recognition according to a pressing operation of the user acting on the device identifier of the target device, so as to unlock the first mobile phone 1010. After successful unlocking, the first mobile phone 1010 establishes a wireless communication connection with the target device and enables a shortcut associated with the target device.
In a second implementation, the off-screen display interface of the first mobile phone 1010 may display device identifiers of at least some of the connected devices. Each device identifier serves as a shortcut identifier. The connected devices are peripheral devices that have established wireless connections with the first mobile phone 1010. The user may select a connected device, referred to as the target device, in the off-screen display interface of the first mobile phone 1010. The first mobile phone 1010 starts fingerprint recognition or face recognition according to a pressing operation of the user acting on the device identifier of the target device, so as to unlock the first mobile phone 1010. After successful unlocking, the first mobile phone 1010 may enable a shortcut associated with the target device.
In the first and second implementations, the device identifier includes, but is not limited to, a combination of one or more of a pattern, text, and the like.
Patterns include, but are not limited to, regular geometric patterns, irregular patterns, pictures, engineering drawings, or the like. For example, the pattern may take the form of a physical or visual schematic of the connectable device, or the like. As another example, the pattern may also employ dots or any geometric pattern, etc.
Text includes, but is not limited to, combinations of one or more of letters, numbers, words, symbols (e.g., emoticons), and the like. For example, the text may employ the device name or device type of the connectable device. The device type is, for example, a television, a display screen, a tablet computer or a notebook computer. As another example, the text may employ the friendly name of the connectable device custom-defined by the holder of the device, i.e., a name that can be recognized by other devices.
In the case where the device identification employs a combination of patterns and text, in some implementations, the corresponding patterns and text for the same connectable device may be displayed independently. For example, the pattern employs an outline of the connectable device, below which the device name is displayed. In other implementations, the pattern and text corresponding to the same connectable device may be displayed in a fused manner. For example, the pattern takes a rectangular frame, and the device name is displayed within the rectangular frame.
It should be noted that, the device identifier may be displayed in a static manner, or may be displayed in a dynamic manner, for example, in a flashing manner.
It should be further noted that the display area of the device identifiers in the off-screen display interface may be set by default or by user customization. The display area of the device identifiers may include the area above, below, to the lower left, or to the lower right of the screen of the electronic device, and so on.
Based on the first implementation or the second implementation, in some implementations, the device identifiers displayed in the off-screen display interface correspond to electronic devices logged in to the same account as the first mobile phone 1010.

Based on the first implementation or the second implementation, in some implementations, the first mobile phone 1010 may determine an upper limit on the number of device identifiers displayed in the off-screen display interface according to a system default setting or a user-defined setting.

Based on the first implementation or the second implementation, in some implementations, the distribution of the device identifiers in the off-screen display interface may not be mapped according to the spatial relationships of the respective connectable devices; instead, it can be set by default or customized by the user.

The distribution of the device identifiers can be random or at regular intervals. The distribution may form a regular geometric figure, such as a straight line, a triangle, or a matrix, or it may form an irregular curve. For example, when there are three device identifiers, they are distributed at the three corners of an equilateral triangle; when there are four, they are distributed at the four corners of a rectangle.
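The corner layouts described above (three identifiers on an equilateral triangle, four on a rectangle-like figure) can be sketched as placing the identifiers at the vertices of a regular polygon. The following is a minimal illustration only, assuming normalized screen coordinates and a hypothetical helper name; it is not part of the patent's disclosure.

```python
import math

def identifier_positions(n, cx=0.5, cy=0.35, r=0.25):
    """Place n device identifiers at the vertices of a regular n-gon
    centred at (cx, cy), radius r, in normalized screen coordinates.
    This realizes one possible regular-interval distribution."""
    if n == 1:
        return [(cx, cy)]
    positions = []
    for k in range(n):
        angle = 2 * math.pi * k / n - math.pi / 2  # start from the top vertex
        positions.append((cx + r * math.cos(angle),
                          cy + r * math.sin(angle)))
    return positions

# Three identifiers land on the corners of an equilateral triangle,
# four on the corners of a square-like figure.
print(identifier_positions(3))
```

A random distribution, by contrast, would simply draw each coordinate from a uniform range within the permitted display area.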
As a non-limiting example, based on the application scenario shown in fig. 10, as shown in fig. 11A, in addition to the time, date and battery level, the off-screen display interface of the first mobile phone 1010 displays the device identifiers corresponding to the four connectable devices, that is, the device identifiers corresponding to the television 1020, the smart speaker 1030, the tablet computer 1040 and the second mobile phone 1050. The distribution of the four device identifiers forms a regular rectangle. The device identifier 1102 corresponds to the television 1020, the device identifier 1103 corresponds to the smart speaker 1030, the device identifier 1104 corresponds to the tablet computer 1040, and the device identifier 1105 corresponds to the second mobile phone 1050. In the example shown in fig. 11A, in order to better enable the user to distinguish between the connectable devices and thereby improve operation efficiency and accuracy, each device identifier adopts a device outline drawing.
According to the system default setting or the shortcut set by the user, when the first mobile phone 1010 detects that the user presses the area displaying the device identifier 1102 in fig. 11A, the first mobile phone 1010 starts fingerprint recognition or face recognition to authenticate the user identity. In some embodiments, when the user identity authentication passes, the first mobile phone 1010 unlocks, establishes a wireless communication connection with the television 1020, and enters the television control panel interface. The television control panel interface 1110 may be as shown in fig. 11B, and the user may control the television 1020 to turn on by clicking the power-on control 1111 in the television control panel interface 1110. In other embodiments, when the user identity authentication passes, the first mobile phone 1010 is unlocked, establishes a wireless communication connection with the television 1020, and controls the television 1020 to turn on according to the established wireless communication connection. In still other embodiments, the first mobile phone 1010 unlocks, establishes a wireless communication connection with the television 1020, controls the television 1020 to turn on according to the established connection, and enters the television control panel interface. It should be appreciated that which operation the first mobile phone 1010 performs after the user identity authentication passes is determined by the system default setting or the user-defined shortcut.
Based on the examples shown in fig. 11A and 11B, implementations in which the first mobile phone 1010 controls the television 1020 to be turned on may include the following two non-limiting examples.
First, the first mobile phone 1010 receives a user operation, for example, the user presses the device identifier 1102 corresponding to the television 1020 and successfully unlocks the first mobile phone 1010, or the user clicks the power-on control in the television control panel interface. The first mobile phone 1010 generates a corresponding power-on instruction and, based on the communication connection established between the first mobile phone 1010 and the television 1020, sends the control instruction to the television 1020, so that the first mobile phone 1010 controls the television 1020 to turn on.
Second, according to a user operation, for example, the user pressing the device identifier 1102 corresponding to the television 1020 and successfully unlocking the first mobile phone 1010, or the user clicking the power-on control in the television control panel interface, the first mobile phone 1010 generates a control instruction and sends it to a cloud, for example, a smart home cloud. The cloud obtains the current state of the television 1020, forwards the control instruction to the television 1020, obtains the execution result from the television 1020, and returns the execution result to the first mobile phone 1010.
As another non-limiting example, based on the application scenario shown in fig. 10, as shown in fig. 12A, in addition to the time, date and battery level, the off-screen display interface of the first mobile phone 1010 displays the device identifiers corresponding to the 2 connected devices, that is, the device identifiers corresponding to the smart speaker 1030 and the tablet computer 1040. The distribution of the 2 device identifiers is random. The device identifier 1203 corresponds to the smart speaker 1030, and the device identifier 1204 corresponds to the tablet computer 1040. In the example shown in fig. 12A, in order to better enable the user to distinguish between the connected devices and thereby improve operation efficiency and accuracy, each device identifier adopts a device outline drawing.
According to the system default setting or the user-defined shortcut, when the first mobile phone 1010 detects that the user presses the area displaying the device identifier 1204 in fig. 12A, the first mobile phone 1010 starts fingerprint recognition or face recognition to authenticate the user identity. When the user identity authentication passes, the first mobile phone 1010 is unlocked, and an unlocking interface of the first mobile phone 1010 and a display interface of the tablet computer 1040 are displayed. The two interfaces can be displayed in a split-screen manner or in a superimposed manner. In the case of superimposed display, as an example, as shown in fig. 12B, the display interface 1214 of the tablet computer 1040 may be superimposed on the unlocking interface 1211 of the first mobile phone 1010 in the form of a floating window.
In some embodiments, after the first mobile phone 1010 is successfully unlocked, it sends a screen-casting instruction to the tablet computer 1040 over the established wireless communication connection, and the tablet computer 1040 casts its display interface to the first mobile phone 1010 according to the screen-casting instruction. The display interface of the tablet computer is thus pulled up directly across devices on the first mobile phone 1010, and can be transferred to the mobile phone for display and operation.
It should be noted that which display interfaces of the tablet computer can be pulled up by the first mobile phone 1010 may be set by user customization on the tablet computer or by the default settings of the tablet computer's system.
If the first mobile phone 1010 continues to play music in the locked state, when the first mobile phone 1010 detects that the user presses the area displaying the device identifier 1203 in fig. 12A, the first mobile phone 1010 starts fingerprint recognition or face recognition to authenticate the user identity. When the user identity authentication passes, the first mobile phone 1010 is unlocked, and the music being played by the first mobile phone 1010 is cast to the smart speaker 1030 for playback.
Based on the first implementation or the second implementation, in some implementations, the distribution of the device identifiers in the off-screen display interface may be mapped according to the spatial relationships of the electronic devices. A spatial relationship includes a positional and/or directional relationship.
The first mobile phone 1010 obtains the spatial relationship between each peripheral electronic device (including one or more of the television 1020, the smart speaker 1030, the tablet computer 1040, and the second mobile phone 1050) and the first mobile phone 1010. The spatial relationship includes a distance and an included angle. As one implementation, the distance refers to the relative or absolute distance between each peripheral electronic device and the first mobile phone. The included angle is the angle between the line connecting each peripheral electronic device and the first mobile phone and the orientation of the first mobile phone. As an example, according to UWB positioning technology and the directionality of UWB technology, the distance and the included angle between the first mobile phone and a peripheral electronic device may be calculated. Owing to the directionality of UWB technology, when the orientation of the first mobile phone changes, the included angle between the first mobile phone and each peripheral electronic device changes.
In the following embodiments or examples of the present application, for convenience of description, the included angle is calculated as the angle swept when rotating clockwise from the orientation of the first mobile phone to the line connecting the first mobile phone and the peripheral electronic device. It should be understood that this is an exemplary description and implies no specific limitation.
Referring to the application scenario shown in fig. 10, as shown in fig. 13A, point A represents the positioning point of the first mobile phone 1010, point B represents the positioning point of the television 1020, and the distance between point A and point B is a; point C represents the positioning point of the tablet computer 1040, and the distance between point A and point C is b. As shown in fig. 13A, assuming that the orientation of the first mobile phone 1010 is the direction indicated by the arrow X in fig. 13A, the line on which the arrow X lies is rotated clockwise by an angle θ1 to the line between point A and point B. The angle θ1, that is, the angle between the line connecting the television 1020 and the first mobile phone 1010 and the line on which the orientation of the first mobile phone 1010 lies, may be simply referred to as the included angle between the first mobile phone 1010 and the television 1020. Likewise, the line on which the arrow X lies is rotated clockwise by an angle θ2 to the line between point A and point C, and the angle θ2 may be simply referred to as the included angle between the first mobile phone 1010 and the tablet computer 1040. The device identifiers in the off-screen display interface of the first mobile phone 1010 are laid out according to the spatial relationships of the peripheral electronic devices. Specifically, the distribution of the device identifiers is mapped according to the spatial relationship between each peripheral electronic device and the first mobile phone.
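Given anchor coordinates for the phone and a peripheral device (for example from UWB ranging, as the scenario assumes) together with a heading for the phone's orientation, the clockwise included angle and the distance described above can be computed as follows. This is a minimal sketch under assumed conventions (compass-style headings, planar coordinates, hypothetical function names), not the patent's implementation.

```python
import math

def included_angle(phone_xy, phone_heading_deg, device_xy):
    """Included angle in degrees (0-360), measured by rotating clockwise
    from the phone's orientation to the line from the phone to the device.
    Headings are compass-style: 0 deg = +y axis, increasing clockwise."""
    dx = device_xy[0] - phone_xy[0]
    dy = device_xy[1] - phone_xy[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # device bearing
    return (bearing - phone_heading_deg) % 360

def distance(phone_xy, device_xy):
    """Straight-line distance between the two positioning points."""
    return math.hypot(device_xy[0] - phone_xy[0],
                      device_xy[1] - phone_xy[1])

# Phone at the origin facing +y; a device due east lies 90 deg clockwise.
print(included_angle((0, 0), 0, (1, 0)))   # 90.0
print(distance((0, 0), (3, 4)))            # 5.0
```

The modulo operation keeps the angle in [0, 360), matching the clockwise-rotation convention adopted in the text; when the phone's heading changes, only `phone_heading_deg` changes and every included angle shifts accordingly.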
It should be noted that, in some implementations, the orientation of the first mobile phone 1010 may be set as shown by the arrow X1 in fig. 13B, that is, the direction of the ray from the top of the first mobile phone along the long side is taken as the orientation. In that case, the included angle between the first mobile phone 1010 and the television 1020 is θ1. In other implementations, the first mobile phone 1010 may be oriented in the direction indicated by the arrow X2 in fig. 13C, in which case the included angle between the first mobile phone 1010 and the television 1020 is θ2. It should be understood that the orientation of the first mobile phone may be set by default or by user customization, according to requirements and/or habits; besides the arrow directions shown in fig. 13B and 13C, other settings may be used, for example the direction indicated by the arrow X3 or X4 in fig. 13D. The orientation of the first mobile phone is not specifically limited in the present application. Preferably, in order to reduce the user's difficulty of use, lower the memorization cost, and improve operation efficiency, the orientation of the first mobile phone may be set to the direction indicated by the arrow X1 in fig. 13B. In subsequent application scenarios, embodiments, implementations or examples of the present application, for convenience of description, the orientation of the mobile phone is taken to be the direction of the ray along the long-side direction of the mobile phone. It should be understood that this implies no specific limitation on the embodiments or implementations of the present application.
Fig. 13A, 13B, and 13C take the distance and/or included angle between the first mobile phone 1010 and the television 1020 as examples; it should be understood that the distances and included angles between the smart speaker 1030, the tablet computer 1040, and the second mobile phone 1050, respectively, and the first mobile phone 1010 may be understood by analogy with these examples.
In the case where the orientation of the first mobile phone 1010 is set to a certain direction, the orientation is fixed, i.e., unchanged, relative to the body of the first mobile phone. When the positions of the first mobile phone 1010 and the peripheral electronic devices do not change but the body of the first mobile phone 1010 turns or rotates, the included angle between the first mobile phone 1010 and each peripheral electronic device may change, precisely because the orientation is unchanged relative to the body. Since the distribution of the device identifiers in the off-screen display interface is mapped according to the spatial relationship between the first mobile phone and the peripheral electronic devices, and the spatial relationship includes the distance and the included angle, the distribution of the device identifiers in the off-screen display interface changes as the included angles between the peripheral electronic devices and the first mobile phone change.
As a non-limiting example, the orientation of the first mobile phone is taken as the direction of the ray along the long side of the mobile phone. In the application scenario shown in fig. 14, the orientation of the first mobile phone 1010 is adjusted from pointing at the television 1020, as shown in diagram A of fig. 14, to pointing at the tablet computer 1040, as shown in diagram B of fig. 14. In this process, the user rotates the body of the first mobile phone 1010 counterclockwise by an angle θ. During the rotation of the body, the included angles between the first mobile phone 1010 and the four connectable electronic devices change, while the distances remain unchanged. The off-screen display interface of the first mobile phone 1010 changes from that shown in diagram A of fig. 15 to that shown in diagram B of fig. 15. As shown in fig. 15, the first mobile phone 1010 is oriented in the ray direction along the long side of the mobile phone, i.e., the direction shown by the black arrow in fig. 15. The off-screen display interface of the first mobile phone 1010 displays the device identifiers corresponding to the four connectable electronic devices, each identifier adopting a device outline drawing. The device identifier 1502 corresponds to the television 1020, the device identifier 1503 corresponds to the smart speaker 1030, the device identifier 1504 corresponds to the tablet computer 1040, and the device identifier 1505 corresponds to the second mobile phone 1050. As can be seen from fig. 15, the distribution or layout of the device identifiers changes as the spatial relationship between the first mobile phone and the peripheral electronic devices changes.
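The effect of the body rotation on the included angles can be captured in one line: a counterclockwise body rotation by some angle increases every clockwise included angle by that same angle, modulo 360, while the distances stay fixed. The sketch below illustrates this under assumed example angles (hypothetical values, not from the patent).

```python
def rotated_angles(device_angles, body_ccw_deg):
    """When the phone body rotates counterclockwise by body_ccw_deg while
    all positions stay fixed, the clockwise included angle to each
    peripheral device grows by the same amount (mod 360); distances are
    untouched, so only the layout of the identifiers moves."""
    return {name: (angle + body_ccw_deg) % 360
            for name, angle in device_angles.items()}

# Hypothetical angles: the television sits on the orientation line (0 deg)
# and the tablet 45 deg counterclockwise of it (i.e. at 315 deg clockwise).
before = {"television": 0.0, "tablet": 315.0}
after = rotated_angles(before, 45.0)  # body turned toward the tablet
print(after)  # {'television': 45.0, 'tablet': 0.0}
```

This matches the figure: after the rotation the tablet's identifier moves onto the orientation line while the television's identifier moves off it.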
More generally, when the position of at least one of the first mobile phone 1010 and the connectable devices changes, and/or the included angle between the first mobile phone 1010 and any one or more of the connectable devices changes, the distribution of the device identifiers in the off-screen display interface of the first mobile phone 1010 changes accordingly.
In another non-limiting example, the off-screen display interface may also include an orientation identifier of the first mobile phone 1010. The orientation identifier can be presented in a static form, such as a line or an arrow, or in a dynamic form, such as a flashing line or arrow. The orientation identifier can be displayed in synchronization with the off-screen display interface, or for a shorter time than the off-screen display interface. For example, the orientation identifier is displayed for a duration beginning with the display of the off-screen display interface and then disappears. As another example, the orientation identifier is displayed during the period in which the user rotates, turns, or moves the first mobile phone, and then disappears.
For example, with continued reference to the application scenario shown in fig. 14, as shown in fig. 16, the first mobile phone 1010 is oriented in the ray direction along the long side of the mobile phone, i.e., the direction indicated by the black arrow. The orientation identifier of the first mobile phone 1010 in the off-screen display interface is a white line 1611. The off-screen display interface displays the four connectable devices searched by the first mobile phone 1010, with the device identifiers in rectangular patterns. The device identifier 1602 corresponds to the television 1020, the device identifier 1603 corresponds to the smart speaker 1030, the device identifier 1604 corresponds to the tablet computer 1040, and the device identifier 1605 corresponds to the second mobile phone 1050. The body of the first mobile phone 1010 rotates counterclockwise by an angle, and the orientation of the first mobile phone 1010 is adjusted from pointing at the television 1020 to pointing at the tablet computer 1040. In the off-screen display interface shown in diagram A of fig. 16, the device identifier 1602 corresponding to the television 1020 is located on the white line 1611. In the off-screen display interface shown in diagram B of fig. 16, the device identifier 1604 corresponding to the tablet computer is located on the white line 1611. As can be seen from fig. 16, the orientation identifier in the off-screen display interface, i.e., the position of the white line, does not change as the spatial relationship between the first mobile phone 1010 and the connectable devices changes; rather, the display position of the device identifier corresponding to each connectable device changes.
Because the orientation identifier is displayed in the off-screen display interface, the user can more intuitively confirm which device the orientation of the mobile phone points at, the distribution or layout of the device identifiers can be matched to the actual scene more easily, and the user can accurately and efficiently select the target device and enable the shortcut for the target device.
In the examples shown in fig. 15 and 16, the user may press (e.g., click or long-press) the screen area displaying any device identifier, and after the mobile phone is unlocked, the mobile phone enables the shortcut corresponding to that device identifier.
In the examples shown in fig. 15 and 16, the device identifiers (i.e., the shortcut identifiers) in the off-screen display interface are displayed in the upper area of the screen. The display area of the shortcut identifiers can be determined by the system settings or by user-defined settings. In some embodiments, because the mobile phone screen is large, in order to facilitate user operation and improve operation efficiency, the shortcut identifiers in the off-screen display interface may be displayed in the lower area of the screen according to the system default settings or user-defined settings. In other embodiments, to further facilitate user operation on a large screen and thus further improve operation efficiency, the shortcut identifiers may be displayed in the lower-left or lower-right area of the screen according to the detected left-hand or right-hand holding state. In still other embodiments, the shortcut identifiers in the off-screen display interface may change display area, and likewise other content in the off-screen display interface may change display area, for example, changing the display position of the displayed content once every preset time period, or displaying the content at a position different from the previous one. It should be understood that this is an exemplary description only and is not to be construed as a specific limitation upon the present application.
For the implementations in which the distribution of the device identifiers in the off-screen display interface is mapped according to the spatial relationships of the connectable devices, the first mobile phone 1010 can further obtain preset conditions according to the system default settings or user-defined settings, and display in the off-screen display interface only the device identifiers of the peripheral devices meeting the preset conditions.
As a non-limiting example, the user sets the preset condition to a maximum deviation angle for the peripheral devices, for example, 15° (angle unit: degrees). Thus, a peripheral device whose device identifier is displayed in the off-screen display interface of the first mobile phone 1010 deviates from the orientation of the first mobile phone 1010 by no more than 15°. That is, when the included angle between a peripheral device and the first mobile phone 1010 is within 0° to 15° or within 345° to 360°, the device identifier corresponding to that peripheral device is displayed in the off-screen display interface. With this arrangement, the number of peripheral devices displayed in the off-screen display interface is reduced, misoperation caused by the user's inability to distinguish the peripheral devices accurately is avoided, and operation accuracy is improved.
As another non-limiting example, the user sets the preset conditions to a maximum deviation angle for the peripheral devices, for example 15° (angle unit: degrees), and an upper limit of 2 on the number of device identifiers. Thus, at most 2 device identifiers are displayed in the off-screen display interface of the first mobile phone 1010, and each corresponding peripheral device deviates from the orientation of the first mobile phone 1010 by no more than 15°. In some examples, if there are exactly 2 peripheral devices whose included angle with the first mobile phone 1010 is within 0° to 15° or 345° to 360° (i.e., within −15° to +15°), the device identifiers corresponding to those 2 peripheral devices are displayed in the off-screen display interface. In other examples, if there are more than 2 such peripheral devices, the device identifiers corresponding to the two peripheral devices with the smallest deviation angles are displayed in the off-screen display interface.
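The two preset conditions above (maximum deviation angle, upper limit on the number of identifiers) amount to a filter followed by a smallest-deviation selection. The following is a minimal sketch with hypothetical names and example data; the deviation of an included angle a is min(a, 360 − a), so both the 0°–15° and the 345°–360° ranges are covered.

```python
def select_display_identifiers(devices, max_dev_deg=15.0, max_count=2):
    """devices: list of (name, included_angle_deg) pairs, with the angle
    measured clockwise from the phone's orientation in [0, 360).
    Keep devices whose deviation from the orientation line is at most
    max_dev_deg, then the max_count of them with the smallest deviation."""
    def deviation(angle):
        a = angle % 360.0
        return min(a, 360.0 - a)

    eligible = [(deviation(a), name) for name, a in devices
                if deviation(a) <= max_dev_deg]
    eligible.sort(key=lambda pair: pair[0])  # stable: ties keep input order
    return [name for _, name in eligible[:max_count]]

# Hypothetical included angles for the scenario's four peripheral devices.
devices = [("television", 10.0), ("tablet", 350.0),
           ("speaker", 90.0), ("second phone", 14.0)]
print(select_display_identifiers(devices))  # ['television', 'tablet']
```

Here the speaker (90° deviation) is filtered out, and of the three remaining devices only the two with the smallest deviations (10° each) are shown.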
Fourth application scenario
In some possible implementations, the mobile phone may determine, according to the user's usage habits and/or scene information, whether to adopt the scheme of the first application scenario (hereinafter referred to as the first shortcut scheme), that is, displaying shortcut identifiers for applications or functions in the mobile phone on the screen-off display interface, or the scheme of the third application scenario (hereinafter referred to as the second shortcut scheme), that is, displaying shortcut identifiers for peripheral devices on the screen-off display interface. The scene information includes location information and the like. The fourth application scenario thus achieves switching between the different shortcut schemes.
As a non-limiting example, the mobile phone may derive usage rules for the first and second shortcut schemes from their usage records over a certain period of time. For example, the statistics may show that concentrated use of the second shortcut scheme in one day falls into 2 time periods: 8:00 to 9:00, and 10:00 to 11:00. When the mobile phone detects that the current time is within these 2 time periods, the second shortcut scheme is adopted, that is, shortcut identifiers for peripheral devices are displayed on the screen-off display interface. When the current time is outside these 2 time periods, the first shortcut scheme is adopted, that is, shortcut identifiers for applications or functions in the mobile phone are displayed on the screen-off display interface.
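The time-period rule above can be sketched as follows. The period values are the example ones from the text (8:00 to 9:00 and 10:00 to 11:00); the function and scheme names are illustrative, not from the patent.

```python
from datetime import time

# Periods in which the statistics showed concentrated use of the second scheme.
SECOND_SCHEME_PERIODS = [(time(8, 0), time(9, 0)), (time(10, 0), time(11, 0))]

def pick_scheme(now: time) -> str:
    for start, end in SECOND_SCHEME_PERIODS:
        if start <= now <= end:
            return "second"   # show shortcut identifiers for peripheral devices
    return "first"            # show shortcut identifiers for apps/functions

print(pick_scheme(time(8, 30)))   # second
print(pick_scheme(time(12, 0)))   # first
```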
As another non-limiting example, the mobile phone may determine its current location through positioning, and determine the scene in which the user is located from that location, such as a home or office scene. When the mobile phone determines that the user is in a scene such as a home or an office, it switches to the second shortcut scheme, that is, shortcut identifiers for peripheral devices are displayed on the screen-off display interface. In scenes other than a home or an office, the mobile phone switches to the first shortcut scheme, that is, shortcut identifiers for applications or functions in the mobile phone are displayed on the screen-off display interface.
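A minimal sketch of this location-based switching, under the assumption that the phone can resolve its position and that known scenes are registered as coordinate regions. The coordinates and the simple bounding-box test are placeholders for a real positioning/geofencing service.

```python
# Known scene centers as (lat, lon); values are illustrative only.
SCENES = {"home": (31.2304, 121.4737)}

def scene_for(location, known_scenes, radius_deg=0.001):
    """Return the name of the scene whose center is near the location."""
    for name, (lat, lon) in known_scenes.items():
        if abs(location[0] - lat) <= radius_deg and abs(location[1] - lon) <= radius_deg:
            return name
    return "other"

def pick_scheme_by_location(location, known_scenes):
    # Home/office scenes -> second scheme (device identifiers); otherwise first.
    return "second" if scene_for(location, known_scenes) in ("home", "office") else "first"

print(pick_scheme_by_location((31.2304, 121.4737), SCENES))  # second
print(pick_scheme_by_location((40.0, 116.0), SCENES))        # first
```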
It should be noted that whether to switch the shortcut scheme may be set by default or customized by the user. The mobile phone may switch the shortcut scheme automatically according to the user's usage habits and/or scene information, or switch it in response to a manual operation by the user.
With reference to the foregoing application scenarios and related drawings, an embodiment of the present application provides an unlocking method, which may be executed by an electronic device. For example, the unlocking method may be performed by the mobile phone in the aforementioned application scenarios. For another example, in other practical application scenarios, the unlocking method may also be performed by an electronic device with a touch display screen, such as a tablet computer, a television, or a smart speaker. As shown in fig. 17, the unlocking method is applied to an electronic device and includes steps S1710 to S1730.
S1710, receiving a touch operation input by a user on a screen-off display interface, wherein the screen-off display interface includes one or more shortcut identifiers, and each shortcut identifier corresponds to an application or a function in the electronic device.
Wherein the shortcut identifier includes a pattern and/or text, and the like.
The electronic device determines the application or function corresponding to each shortcut identifier according to a system default setting or a user-defined setting.
S1720, if it is determined that the touch operation acts on a target shortcut identifier, starting user identity authentication, wherein the target shortcut identifier is any one of the one or more shortcut identifiers.
Wherein the user identity authentication includes identity authentication based on fingerprint recognition and/or face recognition, and the like. The electronic device records the user's fingerprint information and/or face information in advance for subsequent identity authentication.
When identity authentication based on fingerprint recognition is adopted, the touch operation input by the user on the screen-off display interface may be a finger press operation. On the one hand, the electronic device can determine whether the user's finger acts on a certain shortcut identifier, namely the target shortcut identifier; on the other hand, the user's fingerprint is collected for identity authentication.
It should be noted that, in some implementations, if it is determined that the touch operation does not act on any shortcut identifier, the electronic device does not unlock and continues to display the screen-off display interface. In some implementations, if it is determined that the touch operation does not act on any shortcut identifier, the electronic device may start the camera to perform face recognition, and after the face recognition passes, the electronic device unlocks and enters the unlocked interface. In some implementations, a fingerprint pattern may be displayed on the screen-off display interface to prompt the user to unlock the electronic device through fingerprint recognition; if it is determined that the touch operation acts on the fingerprint pattern, the electronic device starts fingerprint recognition, and if the fingerprint recognition passes, the electronic device unlocks and enters the unlocked interface.
S1730, if the user identity authentication passes, unlocking and displaying the interface of the application or function corresponding to the target shortcut identifier.
After the user identity authentication passes, the electronic device unlocks and displays the interface of the application or function corresponding to the target shortcut identifier, so that the user can use the function or application directly.
It should be noted that, in some implementations, if the user identity authentication does not pass, the electronic device does not unlock and continues to display the screen-off display interface.
In this embodiment, the screen-off display interface is divided into a plurality of areas, each of which serves as a shortcut into a specific application or function. The user can enter the specific application or function through fingerprint unlocking or face recognition, thereby improving the user's operation efficiency.
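Steps S1710 to S1730 can be sketched as the following control flow. All names are illustrative, and authentication, unlocking, and rendering are abstracted into callbacks; a real implementation would use the platform's biometric and window APIs.

```python
def handle_touch(touch_xy, shortcut_regions, authenticate, open_target, stay_locked):
    """shortcut_regions: list of ((x0, y0, x1, y1), target) pairs laid out
    on the screen-off display interface."""
    x, y = touch_xy
    for (x0, y0, x1, y1), target in shortcut_regions:
        if x0 <= x <= x1 and y0 <= y <= y1:   # S1710/S1720: touch hit a shortcut
            if authenticate():                 # S1720: fingerprint/face authentication
                return open_target(target)     # S1730: unlock and open the target
            return stay_locked()               # auth failed: stay on the AOD
    return stay_locked()                       # no shortcut hit: stay on the AOD

regions = [((0, 0, 100, 100), "camera"), ((0, 120, 100, 220), "payment")]
result = handle_touch((50, 150), regions,
                      authenticate=lambda: True,
                      open_target=lambda t: f"unlocked:{t}",
                      stay_locked=lambda: "locked")
print(result)  # unlocked:payment
```

Note that with a fingerprint press the single touch serves both purposes described above: hit-testing the shortcut identifier and capturing the fingerprint for authentication.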
Another embodiment of the present application provides an unlocking method, which may be performed by an electronic device. For example, the unlocking method may be performed by the mobile phone in the aforementioned application scenarios. For another example, in other practical application scenarios, the unlocking method may also be performed by a tablet computer, a television, or a smart speaker with a touch display screen. As shown in fig. 18, the unlocking method is applied to an electronic device and includes steps S1810 to S1830.
S1810, receiving a touch operation input by a user on a screen-off display interface, wherein the screen-off display interface includes one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices.
Wherein the device identifier includes a pattern and/or text, and the like.
The electronic device determines the shortcut of the peripheral device corresponding to each device identifier according to a system default setting or a user-defined setting.
S1820, if it is determined that the touch operation acts on a target device identifier, starting user identity authentication, wherein the target device identifier is any one of the one or more device identifiers.
Wherein the user identity authentication includes identity authentication based on fingerprint recognition and/or face recognition, and the like. The electronic device records the user's fingerprint information and/or face information in advance for subsequent identity authentication.
When identity authentication based on fingerprint recognition is adopted, the touch operation input by the user on the screen-off display interface may be a finger press operation. On the one hand, the electronic device can determine whether the user's finger acts on a certain device identifier, namely the target device identifier; on the other hand, the user's fingerprint is collected for identity authentication.
It should be noted that, in some implementations, if it is determined that the touch operation does not act on any device identifier, the electronic device does not unlock and continues to display the screen-off display interface. In some implementations, if it is determined that the touch operation does not act on any device identifier, the electronic device may start the camera to perform face recognition, and after the face recognition passes, the electronic device unlocks and enters the unlocked interface. In some implementations, a fingerprint pattern may be displayed on the screen-off display interface to prompt the user to unlock the electronic device through fingerprint recognition; if it is determined that the touch operation acts on the fingerprint pattern, the electronic device starts fingerprint recognition, and if the fingerprint recognition passes, the electronic device unlocks and enters the unlocked interface.
S1830, if the user identity authentication passes, unlocking and starting the shortcut corresponding to the target device identifier.
After the user identity authentication passes, the electronic device unlocks and starts the shortcut corresponding to the target device identifier.
It should be appreciated that the shortcut is associated with the target device corresponding to the target device identifier.
The shortcuts include, but are not limited to: turning on the target device, displaying a control panel interface of the target device, calling up a display interface of the target device, casting a screen or audio to the target device, and the like.
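The listed shortcut types can be sketched as a simple dispatch over the shortcut configured for the target device. The action names and returned strings are placeholders for illustration, not a real device-control API.

```python
def run_shortcut(action, device_id):
    """Dispatch one of the shortcut types listed above for the target device."""
    handlers = {
        "power_on":      lambda d: f"sent power-on command to {d}",
        "control_panel": lambda d: f"showing control panel of {d}",
        "call_up":       lambda d: f"showing display interface of {d} (split-screen/overlay)",
        "cast":          lambda d: f"casting screen/audio to {d}",
    }
    try:
        return handlers[action](device_id)
    except KeyError:
        raise ValueError(f"unknown shortcut action: {action}")

print(run_shortcut("cast", "living-room TV"))  # casting screen/audio to living-room TV
```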
It should be noted that, in some implementations, if the user identity authentication does not pass, the electronic device does not unlock and continues to display the screen-off display interface.
In this embodiment, device identifiers of one or more peripheral devices are displayed on the screen-off display interface, and each device identifier serves as a shortcut to a specific peripheral device. Through fingerprint unlocking or face recognition, the user can quickly start a shortcut for any peripheral device, for example to control the peripheral device, enter its control interface, call up its interface, or cast a screen or audio to it. Cross-device interaction is thus achieved conveniently and quickly while the screen is off, improving the user's operation efficiency.
It should be understood that the order of execution of the processes in the above embodiments should be determined by their functions and inherent logic, and should not be construed as limiting the implementation of the embodiments of the present application.
Corresponding to the unlocking method described in the above embodiment, the embodiment of the present application further provides an unlocking device. Each module included in the unlocking device can correspond to each step of the unlocking method.
It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. The present application can be realized in hardware or a combination of hardware and computer software in conjunction with the description of the embodiments disclosed herein. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application in connection with the embodiments, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It should be noted that, because the content of information interaction and execution process between the modules/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and details thereof are not repeated herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated as an example. In practical applications, the above functions may be allocated to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated units may be implemented in the form of hardware or in the form of software functional units. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other, and do not limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
The embodiment of the present application also provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the computer program, the electronic device implements the steps in the foregoing method embodiments.
As an example, the electronic device may include a wearable device, a cell phone, or a tablet computer, etc.
Embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, may implement the steps in the above-described method embodiments.
Embodiments of the present application provide a computer program product which, when run on an electronic device, causes the electronic device to perform the steps of the method embodiments described above.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application implements all or part of the flow of the methods of the above embodiments through a computer program instructing related hardware; the computer program may be stored in a computer readable storage medium, and when executed by a processor, may implement the steps of each of the method embodiments described above. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the camera device/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunications signals.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed electronic device and method may be implemented in other manners. For example, the electronic device embodiments described above are merely illustrative. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (20)

1. An unlocking method, applied to an electronic device, characterized by comprising:
receiving a touch operation input by a user on a screen-off display interface, wherein the screen-off display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices;
if it is determined that the touch operation acts on a target device identifier, starting user identity authentication, wherein the target device identifier is any one of the one or more device identifiers;
if the user identity authentication passes, unlocking and starting a shortcut corresponding to the target device identifier;
wherein the starting of the shortcut corresponding to the target device identifier comprises:
calling up a display interface of a target device corresponding to the target device identifier;
and the calling up of the display interface of the target device corresponding to the target device identifier comprises:
displaying the target device on an unlocked interface in a split-screen or overlaid display mode, so that the user performs cross-device interaction with the target device.
2. The unlocking method of claim 1, wherein the distribution of the one or more device identifiers in the screen-off display interface is mapped according to a spatial relationship between each of the peripheral devices and the electronic device.
3. The unlocking method according to claim 2, wherein the spatial relationship comprises a spatial relationship of positioning and/or orientation.
4. The unlocking method according to claim 3, wherein in a case where the spatial relationship includes a spatial relationship of positioning and orientation, the spatial relationship includes a distance between each of the peripheral devices and the electronic device, and an angle between a line connecting each of the peripheral devices and the electronic device and an orientation of the electronic device.
5. The unlocking method according to claim 4, wherein the screen-off display interface further comprises an orientation identifier of the electronic device.
6. The unlocking method according to any one of claims 1 to 5, wherein the starting of the shortcut corresponding to the target device identifier comprises:
controlling the target device corresponding to the target device identifier to respond to a preset instruction; or
displaying a control panel interface of the target device corresponding to the target device identifier; or
casting a screen or audio to the target device corresponding to the target device identifier.
7. The unlocking method according to any one of claims 1 to 5, wherein the touch operation comprises a finger press operation, and the user identity authentication comprises fingerprint-based user identity authentication.
8. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that, when executing the computer program, the processor causes the electronic device to perform the following steps:
receiving a touch operation input by a user on a screen-off display interface, wherein the screen-off display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices;
if it is determined that the touch operation acts on a target device identifier, starting user identity authentication, wherein the target device identifier is any one of the one or more device identifiers;
if the user identity authentication passes, unlocking and starting a shortcut corresponding to the target device identifier;
wherein the starting of the shortcut corresponding to the target device identifier comprises:
calling up a display interface of a target device corresponding to the target device identifier;
and the calling up of the display interface of the target device corresponding to the target device identifier comprises:
displaying the target device on an unlocked interface in a split-screen or overlaid display mode, so that the user performs cross-device interaction with the target device.
9. The electronic device of claim 8, wherein the distribution of the one or more device identifiers in the screen-off display interface is mapped according to a spatial relationship between each of the peripheral devices and the electronic device.
10. The electronic device of claim 9, wherein the spatial relationship comprises a spatial relationship of positioning and/or orientation.
11. The electronic device of claim 10, wherein in the case where the spatial relationship comprises a positional and directional spatial relationship, the spatial relationship comprises a distance between each of the peripheral devices and the electronic device, and an angle between a line connecting each of the peripheral devices and the electronic device and an orientation of the electronic device.
12. The electronic device of claim 11, wherein the screen-off display interface further comprises an orientation identifier of the electronic device.
13. The electronic device of any one of claims 8 to 12, wherein the starting of the shortcut corresponding to the target device identifier comprises:
controlling the target device corresponding to the target device identifier to respond to a preset instruction; or
displaying a control panel interface of the target device corresponding to the target device identifier; or
casting a screen or audio to the target device corresponding to the target device identifier.
14. The electronic device of any one of claims 8 to 12, wherein the touch operation comprises a finger press operation, and the user identity authentication comprises fingerprint-based user identity authentication.
15. A computer readable storage medium storing a computer program, characterized in that, when the computer program is executed by a processor, the following steps are performed:
receiving a touch operation input by a user on a screen-off display interface, wherein the screen-off display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices;
if it is determined that the touch operation acts on a target device identifier, starting user identity authentication, wherein the target device identifier is any one of the one or more device identifiers;
if the user identity authentication passes, unlocking and starting a shortcut corresponding to the target device identifier;
wherein the starting of the shortcut corresponding to the target device identifier comprises:
calling up a display interface of a target device corresponding to the target device identifier;
and the calling up of the display interface of the target device corresponding to the target device identifier comprises:
displaying the target device on an unlocked interface in a split-screen or overlaid display mode, so that the user performs cross-device interaction with the target device.
16. The computer-readable storage medium of claim 15, wherein the distribution of the one or more device identifiers in the screen-off display interface is mapped according to a spatial relationship between each of the peripheral devices and the electronic device.
17. The computer readable storage medium of claim 16, wherein the spatial relationship comprises a spatial relationship of positioning and/or orientation.
18. The computer-readable storage medium of claim 17, wherein in the case where the spatial relationship comprises a positional and directional spatial relationship, the spatial relationship comprises a distance between each of the peripheral devices and the electronic device, and an angle between a connection of each of the peripheral devices and the electronic device and an orientation of the electronic device.
19. The computer-readable storage medium of any one of claims 15 to 18, wherein the starting of the shortcut corresponding to the target device identifier comprises:
controlling the target device corresponding to the target device identifier to respond to a preset instruction; or
displaying a control panel interface of the target device corresponding to the target device identifier; or
casting a screen or audio to the target device corresponding to the target device identifier.
20. The computer readable storage medium of any one of claims 15 to 18, wherein the touch operation comprises a finger press operation, and the user identity authentication comprises fingerprint-based user identity authentication.
CN202010911832.4A 2020-09-02 2020-09-02 Unlocking method and electronic equipment Active CN114201738B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010911832.4A CN114201738B (en) 2020-09-02 2020-09-02 Unlocking method and electronic equipment
PCT/CN2021/113610 WO2022048453A1 (en) 2020-09-02 2021-08-19 Unlocking method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010911832.4A CN114201738B (en) 2020-09-02 2020-09-02 Unlocking method and electronic equipment

Publications (2)

Publication Number Publication Date
CN114201738A CN114201738A (en) 2022-03-18
CN114201738B true CN114201738B (en) 2023-04-21

Family

ID=80491580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010911832.4A Active CN114201738B (en) 2020-09-02 2020-09-02 Unlocking method and electronic equipment

Country Status (2)

Country Link
CN (1) CN114201738B (en)
WO (1) WO2022048453A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116700473B (en) * 2022-09-13 2024-04-05 荣耀终端有限公司 Display method and device for screen-extinguishing display and terminal equipment
CN117131555A (en) * 2023-04-28 2023-11-28 荣耀终端有限公司 Information display method and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140081642A (en) * 2012-12-21 2014-07-01 삼성전자주식회사 Method and system for controlling for external apparatus
CN108243281A (en) * 2017-12-27 2018-07-03 深圳信炜科技有限公司 The fingerprint identification method of electronic equipment
CN108958582A (en) * 2018-06-28 2018-12-07 维沃移动通信有限公司 A kind of application program launching method and terminal
CN110597473A (en) * 2019-07-30 2019-12-20 华为技术有限公司 Screen projection method and electronic equipment
CN111459388A (en) * 2020-04-13 2020-07-28 深圳康佳电子科技有限公司 Information screen display method, display equipment and storage medium for smart home information

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140181683A1 (en) * 2012-12-21 2014-06-26 Samsung Electronics Co., Ltd. Method and system for controlling external device
CN111328051B (en) * 2020-02-25 2023-08-29 上海银基信息安全技术股份有限公司 Digital key sharing method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2022048453A1 (en) 2022-03-10
CN114201738A (en) 2022-03-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant