CN117453086A - Sensitive resource access behavior recording method and electronic equipment - Google Patents


Info

Publication number
CN117453086A
Authority
CN
China
Prior art keywords
electronic device
interface
sensitive
application
access
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210847377.5A
Other languages
Chinese (zh)
Inventor
尹泉
Current Assignee (the listed assignees may be inaccurate)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210847377.5A
Publication of CN117453086A

Classifications

    • G06F3/04817: GUI interaction techniques using icons
    • G06F21/604: Tools and structures for managing or administering access control systems
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/0485: Scrolling or panning
    • G06F3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

The application provides a method for recording sensitive resource access behavior, and an electronic device. After a user authorizes an application program to access one or more sensitive resources, the electronic device 100 can record each application program's accesses to those resources. The behavior of an application accessing sensitive resources includes, but is not limited to: the type of sensitive resource accessed by the application, the time at which the sensitive resource is accessed, the duration of the access, the number of accesses, and the like. In this way, the first electronic device can show the user how one or more application programs access each privacy resource under different running states, so that the user can understand application behavior simply, conveniently, comprehensively, and intuitively.

Description

Sensitive resource access behavior recording method and electronic equipment
Technical Field
The application relates to the technical field of terminals, in particular to a sensitive resource access behavior recording method and electronic equipment.
Background
With the development of technology, electronic devices offer increasingly rich functions, and more and more applications are installed on them. When a user uses an application installed on an electronic device, the application needs the user's authorization to access sensitive resources. How to show the user, simply, conveniently, comprehensively, and intuitively, how applications access sensitive resources requires further study.
Disclosure of Invention
The application provides a sensitive resource access behavior recording method and an electronic device, so that a first electronic device can show the user how one or more application programs access each privacy resource under different running states, and the user can understand application behavior simply, conveniently, comprehensively, and intuitively.
In a first aspect, the present application provides a method for recording sensitive resource access behavior, where the method includes: the first electronic device displays a first interface, where a first graph is displayed on the first interface, and the first graph represents the number of times one or more application programs access a first sensitive resource on the first electronic device; the first electronic device receives a first operation for the first interface; in response to the first operation, the first electronic device displays a second interface, in which a plurality of behavior records of the one or more application programs accessing the first sensitive resource on the first electronic device are displayed; where the behavior records include a running state of the one or more applications, the running state including: a locked-screen foreground running state, a locked-screen background running state, an unlocked-screen foreground running state, and an unlocked-screen background running state.
Alternatively, the running states of the one or more applications may include: a locked-screen running state, an unlocked-screen foreground running state, and an unlocked-screen background running state. That is, in the locked-screen state, whether the application runs in the foreground or the background is no longer distinguished.
Alternatively, the running states of the one or more applications may include: a locked-screen state and an unlocked-screen state. That is, in both the locked-screen state and the unlocked-screen state, whether the application runs in the foreground or the background is no longer distinguished.
Alternatively, the running states of the one or more applications may include: a foreground running state and a background running state. That is, whether the device is in the locked-screen state or the unlocked-screen state is no longer distinguished.
With the method for recording sensitive resource access behavior provided in the first aspect, the first electronic device can show the user how one or more application programs access each privacy resource under different running states, so that the user can understand application behavior simply, conveniently, comprehensively, and intuitively.
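To make the structure of a behavior record concrete, the following is a minimal sketch in Java; the class, field, and state names are illustrative assumptions, not the claimed implementation:

```java
import java.time.Instant;

public class BehaviorRecordDemo {
    // Hypothetical enumeration of the four running states from the first aspect
    enum RunningState {
        LOCKED_FOREGROUND, LOCKED_BACKGROUND,
        UNLOCKED_FOREGROUND, UNLOCKED_BACKGROUND
    }

    // One record of an application accessing a sensitive resource (field names assumed)
    static class BehaviorRecord {
        final String appName;
        final String resourceType;   // e.g. "microphone", "location"
        final Instant accessTime;
        final int accessCount;
        final RunningState state;

        BehaviorRecord(String appName, String resourceType,
                       Instant accessTime, int accessCount, RunningState state) {
            this.appName = appName;
            this.resourceType = resourceType;
            this.accessTime = accessTime;
            this.accessCount = accessCount;
            this.state = state;
        }
    }

    public static void main(String[] args) {
        BehaviorRecord r = new BehaviorRecord("Maps", "location",
                Instant.now(), 3, RunningState.UNLOCKED_FOREGROUND);
        System.out.println(r.appName + " accessed " + r.resourceType
                + " " + r.accessCount + " times (" + r.state + ")");
    }
}
```

A real implementation would persist such records and group them per resource type before rendering the first graph.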
With reference to the first aspect, in one possible implementation manner, the behavior record further includes one or more of the following: the name of the one or more applications, the time the one or more applications accessed the first sensitive resource, the number of times the one or more applications accessed the first sensitive resource.
Alternatively, the time at which one or more applications access the first sensitive resource may be a period of time, such as from a first time to a second time. In this way, the amount of redundant information the first electronic device displays on the user interface can be reduced.
With reference to the first aspect, in one possible implementation manner, the second interface further displays a first control, and after the first electronic device displays the second interface, the method further includes: the first electronic device receives a second operation for the first control; in response to the second operation, the first electronic device displays a third interface, where the third interface displays a plurality of behavior records of one or more application programs accessing the first sensitive resource on the first electronic device while in a first running state.
In this way, the user can filter the records so that the first electronic device displays only the behavior records of applications accessing sensitive resources in a particular running state.
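The filtering step triggered by the first control can be sketched as follows; the record shape and state labels are illustrative assumptions, not the claimed data model:

```java
import java.util.List;
import java.util.stream.Collectors;

public class FilterDemo {
    // Minimal record shape for illustration (Java 16+ record)
    record Record(String app, String state) {}

    // Keep only the behavior records produced in the requested running state
    static List<Record> filterByState(List<Record> all, String state) {
        return all.stream()
                  .filter(r -> r.state().equals(state))
                  .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Record> all = List.of(
                new Record("Maps", "locked-background"),
                new Record("Camera", "unlocked-foreground"),
                new Record("Music", "locked-background"));
        // Show only records created while the device was locked and the app in background
        System.out.println(filterByState(all, "locked-background").size()); // 2
    }
}
```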
With reference to the first aspect, in one possible implementation manner, a second control is further displayed on the first interface; after the first electronic device displays the first interface, the method further comprises: the first electronic device receives a third operation for the second control; in response to the third operation, the first electronic device ceases displaying the second control and the first graphic on the first interface.
In this way, the user can make the first electronic device stop displaying the number of times the one or more applications access the first sensitive resource on the first electronic device, protecting the user's privacy.
With reference to the first aspect, in one possible implementation manner, an icon of the first application program is further displayed on the first interface; after the first electronic device displays the first interface, the method further comprises: the first electronic device receives a fourth operation of the icon for the first application program; in response to the fourth operation, the first electronic device displays a second graphic on the first interface, the second graphic excluding the number of times the first application accesses the first sensitive resource on the first electronic device.
In this way, the user can make the first electronic device omit from the interface the number of times a particular application (e.g., the first application program) accesses the first sensitive resource on the first electronic device, protecting the user's privacy.
With reference to the first aspect, in one possible implementation manner, the first interface further displays an option of a first sensitive resource; after the first electronic device displays the first interface, the method further comprises: the first electronic device receiving a fifth operation for an option of the first sensitive resource; in response to the fifth operation, the first electronic device stops displaying the first graphic.
In this way, the user can make the first electronic device stop displaying on the interface the number of times applications access a certain privacy resource (e.g., the first sensitive resource) on the first electronic device, protecting the user's privacy.
Alternatively, after the first electronic device stops displaying the first graphic, the first electronic device may automatically display the number of times the one or more applications access another sensitive resource on the first electronic device.
With reference to the first aspect, in one possible implementation manner, the first interface further displays an option of a second sensitive resource; after the first electronic device displays the first interface, the method further comprises: the first electronic device receiving a sixth operation for the option of the second sensitive resource; in response to the sixth operation, the first electronic device displays a third graphic on the first interface, the third graphic representing a number of times the one or more applications access the second sensitive resource on the first electronic device.
In this way, the first electronic device may switch to display the number of times one or more applications access different privacy resources on the first electronic device.
With reference to the first aspect, in one possible implementation manner, a name of the second sensitive resource in the first language is displayed in an option of the second sensitive resource.
With reference to the first aspect, in one possible implementation manner, the method further includes: after the first electronic device switches the system language from a first language to a second language, the first electronic device displays the name of the second sensitive resource in the second language in the option of the second sensitive resource, where the name of the second sensitive resource in the second language is longer than the name in the first language. Accordingly, the option of the second sensitive resource in the second language is longer than the option in the first language, and/or the font size of the name in the second language is smaller than the font size of the name in the first language.
Optionally, icons of the sensitive resources are also displayed in the options of the sensitive resources.
Therefore, after the first electronic device switches the system language, it can adaptively adjust the font size of the sensitive resource's name and/or the length of the sensitive resource's option, so that the name of the sensitive resource is displayed completely.
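The adaptive adjustment can be sketched as follows; the base width, font size, and thresholds are invented placeholder values, not taken from the application:

```java
public class OptionLayoutDemo {
    static final int BASE_OPTION_WIDTH = 120;     // px, placeholder value
    static final int BASE_FONT_SIZE = 14;         // sp, placeholder value
    static final int CHARS_PER_BASE_WIDTH = 10;   // chars that fit at base width, assumed

    // Returns {optionWidth, fontSize} so the full name fits:
    // first widen the option proportionally, then cap the width and shrink the font.
    static int[] fit(String resourceName) {
        int width = BASE_OPTION_WIDTH;
        int font = BASE_FONT_SIZE;
        if (resourceName.length() > CHARS_PER_BASE_WIDTH) {
            width = BASE_OPTION_WIDTH * resourceName.length() / CHARS_PER_BASE_WIDTH;
        }
        if (width > 2 * BASE_OPTION_WIDTH) {      // width capped, shrink font instead
            width = 2 * BASE_OPTION_WIDTH;
            font = BASE_FONT_SIZE - 2;
        }
        return new int[]{width, font};
    }

    public static void main(String[] args) {
        // Short name in a first language keeps the base layout
        System.out.println(java.util.Arrays.toString(fit("Location")));              // [120, 14]
        // Longer name after a language switch widens the option and shrinks the font
        System.out.println(java.util.Arrays.toString(fit("Standortinformationen"))); // [240, 12]
    }
}
```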
With reference to the first aspect, in one possible implementation manner, after the first electronic device displays the first interface, the method further includes: after the first electronic device detects a seventh operation for the first interface, displaying a third interface, where a plurality of behavior records of the first electronic device accessing one or more sensitive resources on a first external device are displayed in the third interface; where the behavior records include one or more of the following: the running state of the first electronic device, the name of the first external device, the time at which the first electronic device accesses the one or more sensitive resources on the first external device, and the number of times the first electronic device accesses the one or more sensitive resources on the first external device.
In this way, in a multi-device interconnection scenario, the first electronic device may switch to display multiple behavior records of the first electronic device accessing one or more sensitive resources on the first external device.
With reference to the first aspect, in one possible implementation manner, after the first electronic device displays the first interface, the method further includes:
after the first electronic device detects an eighth operation for the first interface, displaying a fourth interface, where a plurality of behavior records of one or more external devices accessing the first sensitive resource on the first electronic device are displayed in the fourth interface;
wherein the behavioral record includes one or more of the following: the method comprises the steps of operating state of the first electronic device, names of one or more external devices, time of the one or more external devices accessing the first sensitive resource on the first electronic device, and number of times of the one or more external devices accessing the first sensitive resource on the first electronic device.
In this way, in a multi-device interconnection scenario, the first electronic device may switch to display multiple behavior records of one or more external devices accessing one or more sensitive resources on the first electronic device.
With reference to the first aspect, in one possible implementation manner, the type of the first sensitive resource includes any one of the following: location information, cameras, microphones, telephones, address books, text messages, storage, calendars, memos, sports health data, photo albums, media and files, music, bluetooth.
With reference to the first aspect, in one possible implementation manner, the type of the external device is any one of the following: a mobile phone, a smart band, an in-vehicle head unit, a Bluetooth headset, a smart large-screen device, or a smart watch.
In a second aspect, the present application provides a method for recording sensitive resource access behavior, where the method includes: the first electronic device displays a fifth interface, and a fourth graph is displayed on the fifth interface, where the fourth graph represents the number of times the first electronic device accesses one or more sensitive resources on a first external device; the first electronic device receives a ninth operation for the fifth interface; in response to the ninth operation, the first electronic device displays a third interface, in which a plurality of behavior records of the first electronic device accessing the one or more sensitive resources on the first external device are displayed; where the behavior records include one or more of the following: the running state of the first electronic device, the name of the first external device, the time at which the first electronic device accesses the one or more sensitive resources on the first external device, and the number of times the first electronic device accesses the one or more sensitive resources on the first external device.
In this way, the first electronic device may also display only a plurality of behavior records of the first electronic device accessing one or more sensitive resources on the external device.
Optionally, one or more of the possible implementation manners provided in the first aspect are also applicable to the method for recording sensitive resource access behavior provided in the second aspect; details are not repeated here.
In a third aspect, the present application provides a method for recording sensitive resource access behavior, where the method includes: the first electronic device displays a sixth interface, and a fifth graph is displayed on the sixth interface, where the fifth graph represents the number of times one or more external devices access a first sensitive resource on the first electronic device; the first electronic device receives a tenth operation for the sixth interface; in response to the tenth operation, the first electronic device displays a fourth interface, where the fourth interface displays a plurality of behavior records of the one or more external devices accessing the first sensitive resource on the first electronic device; where the behavior records include one or more of the following: the running state of the first electronic device, the names of the one or more external devices, the time at which the one or more external devices access the first sensitive resource on the first electronic device, and the number of times the one or more external devices access the first sensitive resource on the first electronic device.
In this way, the first electronic device may also display only a plurality of behavior records of the one or more external devices accessing the one or more sensitive resources on the first electronic device.
Optionally, one or more of the possible implementation manners provided in the first aspect are also applicable to the method for recording sensitive resource access behavior provided in the third aspect; details are not repeated here.
In a fourth aspect, the present application provides an electronic device, which is a first electronic device, the first electronic device including: one or more processors, one or more memories; the one or more memories are coupled to the one or more processors, the one or more memories for storing computer program code comprising computer instructions that the one or more processors invoke to cause the first electronic device to perform a sensitive resource access behavior recording method provided in any of the possible implementations of the above aspect.
In a fifth aspect, the present application provides a computer readable storage medium for storing computer instructions that, when executed on a first electronic device, cause the first electronic device to perform a method of recording sensitive resource access behavior provided in any one of the possible implementations of the above aspect.
In a sixth aspect, the present application provides a computer program product for, when run on a first electronic device, causing the first electronic device to perform a method of recording sensitive resource access behavior provided in any one of the possible implementations of the above aspect.
For the beneficial effects of the second aspect to the sixth aspect, refer to the description of the beneficial effects of the first aspect; details are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application;
Fig. 2 is a schematic software structure diagram of an electronic device 100 according to an embodiment of the present application;
Figs. 3A-3B are a set of schematic diagrams showing an application program in a foreground running state according to an embodiment of the present application;
Figs. 3C-3G are a set of schematic diagrams showing the electronic device 100 in a locked-screen state according to an embodiment of the present application;
Figs. 4A-4E are schematic diagrams illustrating the electronic device 100 displaying application behavior records according to an embodiment of the present application;
Figs. 5A-5H are schematic diagrams illustrating the electronic device 100 receiving a user operation to stop displaying access records according to an embodiment of the present application;
Figs. 6A-6C are schematic diagrams illustrating detailed access records of each application program for a sensitive resource within a certain period of time according to an embodiment of the present application;
Figs. 6D-6I are schematic diagrams illustrating the electronic device 100 filtering and displaying a portion of the detailed access records according to an embodiment of the present application;
Figs. 6J-6P are schematic diagrams illustrating detailed access records of an application program for a sensitive resource within a certain period of time according to an embodiment of the present application;
Fig. 7 is a schematic view of a multi-device interconnection scenario provided in an embodiment of the present application;
Figs. 8A-8G are schematic diagrams of the electronic device 100 showing behavior records of other devices accessing sensitive resources on the electronic device 100 according to an embodiment of the present application;
Figs. 9A-9C are schematic diagrams of the electronic device 100 showing behavior records of other devices accessing sensitive resources on the local device according to an embodiment of the present application;
Figs. 10A-10B are schematic diagrams of a UI provided in an embodiment of the present application;
Fig. 11 is a schematic diagram showing icons and text of privacy resources in a second language according to an embodiment of the present application;
Fig. 12 is a flowchart of a method for recording sensitive resource access behavior according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, "A and/or B" may indicate the three cases where A exists alone, A and B exist together, or B exists alone. In addition, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and an acceptable form of the user. A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be a visual interface element of text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc., displayed in a display of the electronic device.
The embodiments of the present application provide a method for recording sensitive resource access behavior. After the user authorizes an application program to access one or more sensitive resources, the electronic device 100 can record each application program's accesses to those resources. The behavior of an application accessing sensitive resources includes, but is not limited to: the type of sensitive resource accessed by the application, the time at which the sensitive resource is accessed, the duration of the access, the number of accesses, and the like.
Optionally, in a multi-device interconnection scenario, the electronic device 100 may record the behavior of the electronic device 100 accessing sensitive resources on an external device, and may also record the behavior of an external device accessing sensitive resources on the electronic device 100. The type of the external device is any one of the following: a mobile phone, a smart band, an in-vehicle head unit, a Bluetooth headset, a smart large-screen device, or a smart watch.
Optionally, the types of sensitive resources include, but are not limited to: location information, cameras, microphones, telephones, address books, text messages, storage, calendars, memos, sports health data, photo albums, media and files, music, bluetooth, and the like.
The electronic device 100 may distinguish the behavior of an application accessing sensitive resources in each state based on the running state of the application and the operating state of the electronic device 100. The running state of the application includes, but is not limited to, a background running state and a foreground running state. The operating state of the electronic device 100 is either a locked-screen state or an unlocked-screen state. Combining the two, the running state of the application may be further divided into: a locked-screen foreground running state, a locked-screen background running state, an unlocked-screen foreground running state, and an unlocked-screen background running state. The electronic device 100 may then record the behavior of the application accessing sensitive resources in each of these four states separately.
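The division into four combined states is the cross product of the two lock states and the two running states; the sketch below illustrates it with assumed string labels:

```java
public class StateDemo {
    // Combine the device's lock state with the app's foreground/background
    // state into one of the four combined states named in the description.
    static String combinedState(boolean screenLocked, boolean foreground) {
        String lock = screenLocked ? "locked" : "unlocked";
        String run = foreground ? "foreground" : "background";
        return lock + "-" + run;
    }

    public static void main(String[] args) {
        System.out.println(combinedState(true, true));   // locked-foreground
        System.out.println(combinedState(true, false));  // locked-background
        System.out.println(combinedState(false, true));  // unlocked-foreground
        System.out.println(combinedState(false, false)); // unlocked-background
    }
}
```

Collapsing one of the dimensions, as in the alternative embodiments, simply means ignoring one of the two booleans when labeling a record.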
In some embodiments, when the electronic device 100 is in the screen-locked state, there is no need to distinguish between the foreground and background running states, and the running state of an application while the electronic device 100 is locked may simply be referred to as the locked running state. The running state of the application may then be divided into: the locked running state, the unlocked-foreground running state, and the unlocked-background running state. The electronic device 100 may record the application's behavior of accessing sensitive resources separately in each of these three states.
In other embodiments, the electronic device 100 may record the application's behavior of accessing sensitive resources separately in only the foreground and background running states. In still other embodiments, the electronic device 100 may record it separately in only the locked and unlocked running states. The specific implementation is not limited in the present application.
To present each application's behavior of accessing sensitive resources in each running state more intuitively, the electronic device 100 may display this behavior graphically. The graph types include, but are not limited to: bar charts, pie charts, ring charts, area charts, line charts, scatter charts, radar charts, bubble charts, and the like.
For how the electronic device 100 displays each application's behavior of accessing sensitive resources in each of the above running states, refer to the description of the following contents; details are not repeated here in the embodiments of the present application.
Fig. 1 shows a schematic configuration of an electronic device 100.
The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a cellular telephone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR) device, a virtual reality (virtual reality, VR) device, an artificial intelligence (artificial intelligence, AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device. The specific type of the electronic device is not particularly limited in the embodiments of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can fetch them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be separately coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
The microphone 170C, also referred to as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the strength of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touch position based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different strengths may correspond to different operation instructions. For example, when a touch operation whose strength is less than a first pressure threshold acts on the Messages application icon, an instruction to view the message is executed; when a touch operation whose strength is greater than or equal to the first pressure threshold acts on the Messages application icon, an instruction to create a new message is executed.
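The pressure-threshold dispatch in the example above amounts to a small selection function. The sketch below is illustrative only; the threshold value and action names are placeholders, not values from the patent.

```java
// Sketch of the pressure-threshold dispatch described above: the same touch
// location maps to different instructions depending on touch strength.
// Threshold and action names are hypothetical.
class PressureDispatch {
    static String actionForMessagesIcon(float touchStrength, float firstPressureThreshold) {
        // lighter press: view the message; firmer press: create a new message
        return touchStrength < firstPressureThreshold ? "VIEW_MESSAGE" : "NEW_MESSAGE";
    }
}
```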
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse motion, thereby achieving anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking on flip-open may then be set according to the detected open or closed state of the holster or the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and similar applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to lower power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
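The temperature processing strategy above amounts to a small decision function. In the sketch below, the three threshold parameters stand in for the three unnamed thresholds in the text; all values and action names are placeholders.

```java
// Illustrative sketch of the temperature-processing strategy above,
// assuming hot > cold > criticalCold. Threshold values are placeholders.
class ThermalPolicy {
    static String decide(double tempC, double hot, double cold, double criticalCold) {
        if (tempC > hot) return "THROTTLE_CPU";                    // reduce nearby processor performance
        if (tempC < criticalCold) return "BOOST_BATTERY_VOLTAGE";  // avoid low-temperature shutdown
        if (tempC < cold) return "HEAT_BATTERY";                   // warm the battery
        return "NORMAL";
    }
}
```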
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the pulse of the human body to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in an earphone, combined into a bone conduction earphone. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part obtained by the bone conduction sensor 180M, thereby implementing a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal obtained by the bone conduction sensor 180M, thereby implementing a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate charging status, battery level changes, messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card.
Fig. 2 schematically shows a software architecture of the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, an Android system with a layered architecture is taken as an example to illustrate the software structure of the electronic device 100.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android Runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, instant messaging, privacy center, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a system API, an application behavior recording module, a behavior record database, an application activity framework, and the like.
When an application (e.g., application one) accesses a sensitive resource, such as the gallery or the location, the application calls the system application programming interface (application programming interface, API) corresponding to that sensitive resource and accesses the sensitive resource through it. For example, when application one accesses the gallery application, application one calls the gallery API corresponding to the gallery application and accesses the gallery application through the gallery API.
When application one calls a system API, the application behavior recording module records the behavior of application one calling the system API; if the system API is an API corresponding to a sensitive resource, the application behavior recording module records this call as a sensitive-resource access behavior.
Meanwhile, the application behavior recording module also records the start time and the end time of application one's call to the system API.
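One way to capture both the call behavior and its start and end times, as described above, is to wrap the system-API call. The sketch below is a plain-Java illustration of that idea only, not the patent's actual implementation; all names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Hypothetical sketch: wrapping a sensitive system-API call so that the
// caller, the API name, and the start/end times are recorded, as described
// above. All names are illustrative.
class ApiCallRecorder {
    static final List<String> log = new ArrayList<>();

    static <T> T recordCall(String appName, String apiName, Supplier<T> call) {
        long start = System.currentTimeMillis();
        T result = call.get();                 // the actual sensitive-resource access
        long end = System.currentTimeMillis();
        log.add(appName + " called " + apiName + " from " + start + " to " + end);
        return result;
    }
}
```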
The application activity framework records the running state of application one and the running state of the electronic device 100 at the time application one calls the system API. The running states of application one may include the background running state and the foreground running state. The running states of the electronic device 100 may include the screen-locked running state and the unlocked running state.
The application behavior recording module calls the application activity framework to obtain the running state of application one at the time application one calls the system API.
The application behavior recording module then sends the start time and end time of application one's call to the system API, the running state of application one, and the running state of the electronic device 100 at the time of the call to the behavior record database.
The behavior record database records, for each running state, information such as the type of sensitive resource accessed by the application, the time of access, and the number of accesses, so that the application's behavior of accessing sensitive resources in each running state can be analyzed.
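The per-state analysis the database enables can be sketched as a simple aggregation over recorded events. The event layout and key format below are illustrative assumptions, not the patent's schema.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the analysis described above: counting how many times each
// application accessed sensitive resources in each running state.
// The event layout ({ appName, runningState, resourceType }) is hypothetical.
class AccessStats {
    static Map<String, Integer> countByAppAndState(String[][] events) {
        Map<String, Integer> counts = new HashMap<>();
        for (String[] e : events) {
            counts.merge(e[0] + "/" + e[1], 1, Integer::sum);
        }
        return counts;
    }
}
```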
The behavior record database may also record the behavior of other applications accessing sensitive resources in each running state; this is not limited in this embodiment.
When the privacy center needs to obtain each application's behavior of accessing sensitive resources in each running state and display it to the user through an interface, the privacy center may call the application behavior recording module. The application behavior recording module obtains the stored access behaviors of each application in each running state from the behavior record database and sends them to the privacy center. After the privacy center obtains these behaviors, the electronic device 100 may display each application's accesses in each running state graphically, so that the behavior is presented more intuitively.
The Android Runtime includes a core library and virtual machines. The Android Runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in virtual machines. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
First, several running states of the application program and several running states of the electronic device 100 are explained.
1. Foreground running state of the application.
The foreground running state refers to application one having been started, with the electronic device 100 currently displaying a user interface within application one.
Taking application one as an instant messaging application as an example, the foreground running state is described below.
Fig. 3A illustrates a main interface of the electronic device 100.
The main interface of the electronic device 100 includes icons of a plurality of application programs, such as an icon of a file management application, an icon of an email application, an icon of a music application, an icon of an instant messaging application, an icon of a sports health application, an icon of a weather application, an icon of a camera application, an icon of an address book application, an icon of a telephone application, an icon of an information application, and the like. The main interface of the electronic device 100 also includes a calendar indicator (e.g., the current date is June 21, Friday), a time indicator (e.g., the time is 08:08), a weather indicator (e.g., the weather in Shenzhen is cloudy and the temperature is 6 ℃), a power indicator, a signal indicator, and the like.
As shown in fig. 3A, the electronic device 100 receives an input operation (e.g., a click) of an icon of the instant messaging application by a user, and in response to the input operation by the user, the electronic device 100 may display a main interface of the instant messaging application as shown in fig. 3B.
As shown in fig. 3B, the host interface of the instant messaging application includes a display area for one or more contacts. Such as the display area of contact "anywhere", the display area of contact "file transfer assistant", the display area of contact "Henry", the display area of contact "Lucy", the display area of contact "David".
When the electronic device 100 displays the main interface of the instant messaging application, the instant messaging application may be considered to be in the foreground running state. Not only when the electronic device 100 displays the main interface of the instant messaging application, but also when the electronic device 100 displays any other interface of the instant messaging application, the running state of the instant messaging application may be considered to be the foreground running state.
2. Background running state of the application.
The background running state may refer to the application one having been opened, but the electronic device 100 does not display a user interface within the application one.
In some embodiments, the background running state may also mean that application one has not been started and the electronic device 100 does not display a user interface within application one, but application one can still receive events such as new network message notifications and scheduled reminders, and display information such as new message notifications and scheduled reminders in the drop-down notification bar.
Taking application one as an instant messaging application as an example, the background running state is described below.
After the electronic device 100 displays the main interface of the instant messaging application as shown in fig. 3B, if the electronic device 100 receives an operation of the user (for example, a trigger operation on the return key), the electronic device 100 no longer displays the main interface of the instant messaging application shown in fig. 3B but instead displays the main interface of the electronic device 100 shown in fig. 3A. At this time, the running state of the instant messaging application is the background running state.
3. A lock screen state of the electronic device 100.
The screen-locked state refers to the situation in which the electronic device 100 does not receive any input operation from the user within a certain period of time, so the electronic device 100 automatically turns off the screen and locks it; the electronic device 100 is then in the screen-locked state.
On the other hand, in the non-screen-locked state of the electronic device 100, as shown in fig. 3C, the electronic device 100 may receive an input operation (e.g., a pressing operation) of the user on the power key, and in response to the input operation of the user, as shown in fig. 3D, the electronic device 100 is off-screen, and at this time, the electronic device 100 is in the screen-locked state.
Alternatively, the user may also control the electronic device 100 by voice to enter the screen-locked state. For example, the user says the voice command "Small E, enter the lock screen state". After the electronic device 100 recognizes the voice, the electronic device 100 is controlled to enter the screen-locked state.
In some embodiments, when the electronic device 100 enters the screen locking state and in the screen-off state, if the electronic device 100 receives an input operation (for example, a pressing operation) of a user on a power key, or the electronic device 100 receives an input operation (for example, a double click operation) of a user on a display screen, the electronic device 100 will be on screen in response to the input operation of the user, but at this time, the electronic device 100 is still in the screen locking state.
As shown in fig. 3E, when the electronic device 100 receives the operation of the user in the screen-locked state and the screen turns on, only part of the information is displayed on the electronic device 100. That is, more information is displayed when the electronic device 100 turns on in the non-screen-locked state than when it turns on in the screen-locked state.
When the electronic device 100 is on in the screen-locked state, the electronic device 100 may display a date, a time, a flashlight icon, a camera application icon, and the like.
As shown in fig. 3E, the electronic device 100 receives an upward sliding operation by the user on the display screen. In response to the upward sliding operation, if an unlocking procedure has previously been set on the electronic device 100, the electronic device 100 may display a verification interface; if no unlocking procedure has been set, the electronic device 100 will perform the unlocking operation directly, so that the electronic device 100 enters the non-screen-locked state.
In the case where the electronic device is provided with the unlocking procedure, in response to the upward sliding operation, the electronic device 100 may display a user interface for prompting the user to verify facial image information to perform the screen unlocking operation, as shown in fig. 3F. If the facial image information passes verification, the electronic device 100 will perform the screen unlocking operation, so that the electronic device 100 enters the non-screen-locked state. If no facial image information is recognized within a certain period of time, or the facial image information fails verification, the electronic device 100 will display a user interface as shown in fig. 3G for prompting the user to input a password. If the user inputs the password correctly, the electronic device 100 will perform the unlocking operation, so that the electronic device 100 enters the non-screen-locked state.
Alternatively, if the electronic device 100 does not set the face verification unlock procedure, the electronic device 100 may directly display the user interface shown in fig. 3G without displaying the user interface shown in fig. 3F in response to the slide-up operation. Optionally, the electronic device 100 may also perform the unlocking screen operation through fingerprint information or voiceprint information.
4. A non-screen-locked state of the electronic device 100.
As shown in fig. 3F and 3G described above, in the case where the facial image information passes verification and/or the password is input correctly, the electronic device 100 will perform the screen unlocking operation, so that the electronic device 100 enters the non-screen-locked state.
In one possible implementation, if the electronic device 100 is displaying the main interface of the electronic device 100, the electronic device 100 receives an operation by a user to cause the electronic device 100 to enter a screen locking state, and if the facial image information is verified to pass and/or the password is input correctly, the electronic device 100 will execute the screen unlocking operation, the electronic device 100 enters a non-screen locking state, and the electronic device 100 may display the main interface of the electronic device 100 as shown in fig. 3A.
In other possible implementations, if the electronic device 100 is displaying the user interface in the application program one, the electronic device 100 receives the operation of the user to cause the electronic device 100 to enter the screen locking state, and if the facial image information is verified to pass and/or the password is input correctly, the electronic device 100 will execute the screen unlocking operation, and the electronic device 100 enters the non-screen locking state, and the electronic device 100 may display the user interface in the application program one, for example, display the main interface of the instant messaging application shown in fig. 3B.
In combination with the foreground running state and the background running state of the application program, and the screen locking state and the non-screen locking state of the electronic device 100, the running state of the application program can be divided into four types, which are respectively: the screen locking front stage running state, the screen locking back stage running state, the non-screen locking front stage running state and the non-screen locking back stage running state.
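The four combined running states are just the cross product of the two device states and the two application states. A minimal sketch of such a classifier (the function name and label strings are illustrative, not taken from the patent):

```python
def running_state(screen_locked: bool, in_foreground: bool) -> str:
    """Map the device lock state and the application's foreground/background
    state to one of the four combined running states."""
    lock = "screen-locked" if screen_locked else "non-screen-locked"
    position = "foreground" if in_foreground else "background"
    return f"{lock} {position} running state"
```

Each recorded access event could carry the label computed this way at the moment of access.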
The concepts of the front-stage operation state of the screen locking, the back-stage operation state of the screen locking, the front-stage operation state of the screen unlocking and the back-stage operation state of the screen unlocking are respectively explained.
1. And (5) locking the operation state of the front stage of the screen.
The screen-locked foreground running state refers to the case where the running state of application one is the foreground running state and then the electronic device 100 enters the screen-locked state. Within a certain time (e.g., 10 s) after the electronic device 100 enters the screen-locked state, the running state of application one may be considered to be the screen-locked foreground running state.
For example, application one is an instant messaging application. When application one runs in the foreground (i.e., the electronic device 100 displays a user interface of application one), the user communicates with other users through application one. If the user does not operate the electronic device 100 for a long time, or the electronic device 100 receives an operation of the user pressing the power key so that the screen turns off, then within 10 s after the screen turns off, the running state of application one is the screen-locked foreground running state. At this time, the user can still talk with other users through application one.
2. And (5) locking a screen background running state.
The screen-locked background running state refers to the case where the running state of application one is the background running state and then the electronic device 100 enters the screen-locked state; the running state of application one can then be considered the screen-locked background running state. Alternatively, the running state of application one is the foreground running state and then the electronic device 100 enters the screen-locked state; a certain time (e.g., 10 s) after the electronic device 100 enters the screen-locked state, the electronic device 100 changes the running state of application one from the foreground running state to the background running state, and the running state of application one can then be considered the screen-locked background running state.
For example, application one is an instant messaging application. When application one runs in the background (i.e., the electronic device 100 does not display a user interface of application one), the user communicates with other users through application one. If the user does not operate the electronic device 100 for a long time, or the electronic device 100 receives an operation of the user pressing the power key so that the screen turns off, the running state of application one is then the screen-locked background running state. At this time, the user can still talk with other users through application one.
Alternatively, when application one runs in the foreground (i.e., the electronic device 100 displays a user interface of application one) and the user communicates with other users through application one, if the user does not operate the electronic device 100 for a long time, or the electronic device 100 receives an operation of the user pressing the power key so that the screen turns off, then starting 10 s after the screen turns off, the running state of application one is the screen-locked background running state.
In some embodiments, when the electronic device 100 is in the screen-locked state, the foreground and background running states need not be distinguished, and the running state of application one while the electronic device 100 is in the screen-locked state may simply be referred to as the screen-locked running state.
3. And the front stage of the non-lock screen is in an operating state.
The non-screen-locked foreground operation state refers to that when the electronic device 100 is in the non-screen-locked state, the operation state of the first application is the foreground operation state.
For example, application one is an instant messaging application. When the electronic device 100 is in the non-screen-locked state and application one runs in the foreground (i.e., the electronic device 100 displays a user interface of application one), the user communicates with other users through application one; the running state of application one is then the non-screen-locked foreground running state.
4. And (5) a background running state without screen locking.
The non-screen-locked background running state refers to that when the electronic device 100 is in the non-screen-locked state, the running state of the first application is the background running state.
For example, application one is an instant messaging application. When the electronic device 100 is in the non-screen-locked state and application one runs in the background (i.e., the electronic device 100 does not display a user interface of application one), the user communicates with other users through application one; the running state of application one is then the non-screen-locked background running state.
According to the application behavior recording method provided by the embodiments of this application, the electronic device 100 can record the behavior of each application program accessing sensitive resources in different running states and display it for the user to view, so that the user knows the behavior of each application.
The following embodiments are described using examples in which an application program accesses sensitive resources in the screen-locked foreground running state, the screen-locked background running state, the non-screen-locked foreground running state, and the non-screen-locked background running state, respectively. This should not be construed as limiting; the behavior of an application program accessing sensitive resources may also be recorded based on other running states.
Alternatively, the electronic device 100 may display the behavior of applications installed on the electronic device 100 accessing the various sensitive resources in the form of a bar graph. Specifically, the bar graph may show the N applications with the highest number of accesses to each sensitive resource, and the bar graph for each sensitive resource corresponds to one color. Each bar in the graph represents one application. The height of the first bar can be set according to the ranking of the corresponding application's access count, the heights of the remaining bars are scaled according to the ratio of their access counts to the maximum count, and the bars are arranged in order of access count. The access counts are thus conveyed to the user by the relative heights of the bars, and the access count is also displayed above each bar. The top of each bar is rounded. If too few applications have accessed a certain sensitive resource, the remaining blank positions are filled with fixed-height placeholder bars for which no access count and no application icon are displayed.
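The bar-sizing rule described above can be sketched as follows. This is a hypothetical model, not the patent's implementation; the heights, the value of N, and the placeholder height are illustrative assumptions:

```python
def bar_chart(counts, n=3, max_height=100, placeholder_height=30):
    """Build display data for the N most-accessing applications: the tallest
    bar gets max_height, the rest are scaled by count/maximum, and missing
    slots are filled with fixed-height placeholder bars (no count, no icon)."""
    top = sorted(counts.items(), key=lambda item: item[1], reverse=True)[:n]
    bars = []
    if top:
        peak = top[0][1] or 1  # guard against an all-zero count list
        bars = [{"app": app, "count": count,
                 "height": round(max_height * count / peak)}
                for app, count in top]
    while len(bars) < n:  # too few applications accessed this resource
        bars.append({"app": None, "count": None, "height": placeholder_height})
    return bars
```

With the camera counts used later in fig. 4C (109 and 40 accesses), the second bar would be about 37% as tall as the first, and the third slot would be a placeholder.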
Alternatively, the bars may be arranged in order of access count from smallest to largest.
Fig. 4A-4E are schematic diagrams illustrating an electronic device 100 displaying an application behavior record according to an embodiment of the present application.
The application interfaces presented in fig. 4A-4E are for illustrative purposes only, and the style and content of the application interface presented to the user by the electronic device 100 in a real-world environment may vary.
As shown in fig. 4A, the electronic device 100 displays the main interface of the settings application. The main interface of the settings application includes a plurality of setting options, such as a flight mode option, a Wi-Fi option, a Bluetooth option, a personal hotspot option, a mobile network option, a do-not-disturb mode option, a display and brightness option, a Huawei account option, a privacy protection option, and the like. The electronic device 100 may receive an up-down sliding operation by the user on the main interface of the settings application, so that the electronic device 100 can display other, further setting options, which are not described in detail here.
As shown in fig. 4A, the electronic device 100 receives an input operation (e.g., a click) of a privacy protection option by a user, and in response to the input operation by the user, the electronic device 100 may display a user interface 410 as shown in fig. 4B.
The user interface 410 includes, among other things, a tag group 401 and an access record bar graph 402. The tag group 401 includes icons of a plurality of sensitive resources, such as an icon of location information, an icon of the camera, an icon of media and files, an icon of contacts, an icon of the microphone, and the like. Sensitive resources may also include other, further types of resources, which are not limited in the embodiments of this application.
In fig. 4B, the electronic device 100 shows a bar graph of each application's access to the location information within 7 days. If no application installed locally on the electronic device 100 has accessed the location information within 7 days, the access record bar graph 402 may not display any bars.
Controls 403 and 404 are also displayed in the user interface 410. The electronic device 100 may receive an input operation (e.g., a click) by the user for the control 403, and in response to the input operation, the electronic device 100 may stop displaying the behavior records of applications locally installed on the electronic device 100. The electronic device 100 may also receive an input operation (e.g., a click) by the user for the control 404, and in response to the input operation, the electronic device 100 may display detailed behavior records of applications locally installed on the electronic device 100, which are not described in detail here in the embodiments of this application.
The electronic device 100 may also receive user operations to switch to displaying bar graphs of the applications' access to other sensitive resources within 7 days.
By way of example, the other sensitive resource may be a camera.
For example, as shown in fig. 4B, the electronic device 100 may receive an input operation (e.g., a click) by a user on an icon of a camera in the tag group 401, and in response to the input operation by the user, the electronic device 100 may display a user interface 420 as shown in fig. 4C.
User interface 420 displays a bar graph of behavior records of applications installed on electronic device 100 accessing a camera within 7 days. In fig. 4C, the application program accessing the camera includes an instant messaging application, which accesses the camera 109 times in 7 days. The application program for accessing the camera also comprises a short message application, and the number of times the short message application accesses the camera in 7 days is 40. The application program that accessed the camera also included a video application that accessed the camera 1 time in 7 days.
Alternatively, the electronic device 100 may not display the number of accesses on the histogram. When the electronic apparatus 100 receives an operation (e.g., a click operation) by the user with respect to the histogram, the electronic apparatus 100 displays the number of accesses.
As another example, as shown in fig. 4D, the electronic device 100 may receive an input operation (e.g., a single click) by a user on icons of media and files in the tag group 401, and in response to the input operation by the user, the electronic device 100 may display a user interface 430 as shown in fig. 4E.
User interface 430 shows a bar graph of behavior records for applications installed on electronic device 100 to access media and files within 7 days. In fig. 4E, the application program accessing the media and files includes an instant messaging application that accesses the media and files 150 times in 7 days. The application program for accessing the media and the files also comprises a short message application, and the number of times of accessing the media and the files by the short message application is 70 in 7 days. Applications accessing media and files also include sports health applications that access media and files 1 time in 7 days.
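The per-resource counts shown in figs. 4C and 4E (e.g., 109 camera accesses by the instant messaging application in 7 days) amount to filtering the stored events by resource and time window and counting per application. A minimal sketch, assuming a hypothetical `(app, resource, timestamp)` event schema:

```python
from collections import Counter

SECONDS_PER_DAY = 86400

def access_counts(events, resource, now, days=7):
    """Count, per application, the accesses to one sensitive resource within
    the last `days` days."""
    cutoff = now - days * SECONDS_PER_DAY
    return Counter(app for app, res, ts in events
                   if res == resource and cutoff <= ts <= now)
```

The resulting counter feeds directly into the bar graph for the selected sensitive resource.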
As shown in fig. 4E and fig. 4C, the bar graph of camera access behavior records and the bar graph of media and file access behavior records for applications installed on the electronic device 100 differ in color, so as to distinguish the behavior of applications accessing different sensitive resources.
In some embodiments, the user may choose to have the electronic device 100 stop displaying the records of applications accessing the various sensitive resources, to prevent other users from learning how the user of the electronic device 100 uses the various applications.
In some embodiments, the user may also choose to turn off the applications' access records for some of the privacy resources.
In some embodiments, the user may also choose to turn off the access records of some of the applications for the privacy resources.
Fig. 5A to 5H schematically illustrate diagrams of the electronic device 100 receiving a user operation to stop displaying access records.
As shown in fig. 5A, the electronic device 100 receives an input operation (e.g., a click) by the user for the control 403, and in response to the input operation, the electronic device 100 may display a user interface 510 as shown in fig. 5B.
As shown in fig. 5B, the electronic device 100 no longer displays the tag group 401 and the access record bar graph 402. The electronic device 100 may display a prompt message, the content of which may be "Show the application privacy analysis report, to help you know application behavior and its data access in time"; the prompt message is used to prompt the user to turn on the application behavior record. The electronic device 100 may also display a control 501. The electronic device 100 may receive an input operation (e.g., a click) by the user for the control 501, and in response to the input operation, the electronic device 100 may turn the application behavior record back on, i.e., the electronic device 100 may display the user interface 420 shown in fig. 5A.
Optionally, in some embodiments, after the electronic device 100 receives the user's input operation for the control 403, the electronic device 100 may delete each application's access records for the privacy resources. In that case, after the electronic device 100 has received the input operation for the control 403 and then receives an input operation (e.g., a click) for the control 501, in response to the input operation, the electronic device 100 may display the user interface 410 shown in fig. 4B.
In some embodiments, the user may also choose to turn off the applications' access records for some of the privacy resources.
For example, the user may choose to close an access record of the application for the camera privacy resource.
As shown in fig. 5C, the electronic device 100 receives an input operation (e.g., double click) of the user with respect to the camera icon, and in response to the input operation of the user, the electronic device 100 may display a prompt 502 as shown in fig. 5D, where the prompt 502 is used to prompt the user whether to display an access record of the application program with respect to the camera privacy resource.
As shown in fig. 5D, the electronic device 100 receives an input operation (e.g., a click) by the user on the prompt 502, and in response to the input operation, the electronic device 100 displays a user interface 520 as shown in fig. 5E. The user interface 520 differs from the user interface 510 in that, in the user interface 520, the tag group 401 does not include the icon of the camera. In this way, the electronic device 100 does not display the applications' access records for the camera privacy resource.
Optionally, if the user needs to display the applications' access records for the camera privacy resource again, the electronic device 100 may receive an input operation (e.g., a click) by the user for the "more" control 503 in the tag group 401, and in response to the input operation, the electronic device 100 may turn the applications' access records for the camera privacy resource back on in a newly displayed user interface, which is not described in detail here.
In some embodiments, the user may also choose to turn off the access records of some of the applications for the privacy resources.
For example, the user may choose to close an access record for the privacy resource for the instant messaging application.
As shown in fig. 5F, the electronic device 100 receives an input operation (e.g., double click) of the user for the instant messaging application icon, and in response to the input operation of the user, the electronic device 100 may display a prompt 504 as shown in fig. 5G, where the prompt 504 is used to prompt the user whether to display an access record of the instant messaging application for each privacy resource.
As shown in fig. 5G, the electronic device 100 receives an input operation (e.g., a click) of the prompt 504 by the user, and in response to the input operation by the user, the electronic device 100 displays a user interface 530 as shown in fig. 5H. The user interface 530 differs from the user interface 420 in that in the user interface 530, the access record bar graph 402 does not include a bar of the number of accesses by the instant messaging application to the camera. In this way, the electronic device 100 may not expose access records for the instant messaging application for the respective privacy resources.
Optionally, if the user needs to display the instant messaging application's access records for the privacy resources again, the electronic device 100 may receive an input operation (e.g., a click) by the user for the "more" control 503 in the tag group 401, and in response to the input operation, the electronic device 100 may turn the instant messaging application's access records for the privacy resources back on in a newly displayed user interface, which is not described in detail here.
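Both toggles above (hiding one sensitive resource, figs. 5C-5E, and hiding one application, figs. 5F-5H) reduce to the same filtering step at display time. A minimal sketch under a hypothetical record schema with `"app"` and `"resource"` keys:

```python
def visible_records(records, hidden_apps=(), hidden_resources=()):
    """Drop access records for applications or sensitive resources the user
    has toggled off, so they are no longer displayed."""
    return [record for record in records
            if record["app"] not in hidden_apps
            and record["resource"] not in hidden_resources]
```

Keeping the raw records and filtering at display time also matches the optional behavior where turning the record back on restores the previous data (unless the records were deleted, as described for control 403).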
In some embodiments, the electronic device 100 may receive a user operation such that the electronic device 100 displays, for each sensitive resource, the detailed access records of the respective applications for that sensitive resource over a period of time, including but not limited to: the access time, the access count, the running state of the application, and the like.
Fig. 6A-6C are schematic diagrams illustrating detailed access records of various applications for a sensitive resource over a period of time. Fig. 6A-6C are also merely illustrative of the present application, and the user interface may vary in practice.
The one sensitive resource may be, for example, a camera sensitive resource.
As shown in fig. 6A, the electronic device 100 receives an input operation (e.g., a single click) by a user for the control 404, and in response to the input operation by the user, the electronic device 100 may display a user interface 610 as shown in fig. 6B. User interface 610 shows a detailed access record for each application for camera sensitive resources.
As shown in fig. 6B, the user interface 610 shows the detailed access records of different applications for the camera. The detailed access records include: today at 11:25, Huawei Video was in the non-screen-locked foreground running state, and the number of times Huawei Video accessed the camera was 3. Today at 09:03, the instant messaging application was in the screen-locked background running state, and the number of times the instant messaging application accessed the camera was 1. Today at 09:02, the instant messaging application was in the non-screen-locked foreground running state, and the number of times the instant messaging application accessed the camera was 6. Yesterday at 20:49, Huawei Video was in the non-screen-locked foreground running state, and the number of times Huawei Video accessed the camera is not displayed.
It should be noted that each detailed access record may be understood as the number of times the corresponding sensitive resource was accessed within a certain time (for example, within one minute), and the recorded access time may be accurate to the minute. For example, the detailed access record shown in fig. 6B, "today at 11:25, Huawei Video was in the non-screen-locked foreground running state, and the number of times Huawei Video accessed the camera was 3", can be understood as meaning that within the minute 11:25, Huawei Video, in the non-screen-locked foreground running state, accessed the camera 3 times: for example, at 11:25:01 (i.e., 11 hours 25 minutes 1 second), Huawei Video accessed the camera for the first time; at 11:25:30 (i.e., 11 hours 25 minutes 30 seconds), for the second time; and at 11:25:50 (i.e., 11 hours 25 minutes 50 seconds), for the third time.
The recorded time is not limited to minute precision. To reduce the amount of information displayed on the interface of the electronic device 100, the electronic device 100 may display the number of accesses to the corresponding sensitive resource over a longer time (e.g., within one hour), with the recorded access time accurate to the hour. For example, an access record may be "today at 11 o'clock, Huawei Video accessed the camera 3 times", which can be understood as meaning that within the hour of 11 o'clock in the morning, Huawei Video accessed the camera 3 times: for example, at 11:25 (i.e., 11 hours 25 minutes) Huawei Video accessed the camera for the first time, at 11:30 (i.e., 11 hours 30 minutes) for the second time, and at 11:50 (i.e., 11 hours 50 minutes) for the third time. In this way, when an application program accesses a sensitive resource many times, the electronic device 100 does not need to record each access separately, which reduces the redundancy of the access records displayed by the electronic device 100.
Alternatively, to reduce the amount of information displayed on the interface of the electronic device 100, the electronic device 100 may display the number of times the corresponding sensitive resource was accessed within a time period. The time period may be 10 minutes, or may be another length, which is not limited in the embodiments of the present application. As shown in fig. 6C, user interface 620 illustrates detail access records for the camera for different applications. The detail access records include: today 11:25-11:35, the video application was in the non-screen-locking foreground running state and accessed the camera 10 times; today 10:20-10:30, the instant messaging application was in the screen-locking background running state and accessed the camera 6 times; today 09:02-09:12, the instant messaging application was in the non-screen-locking foreground running state and accessed the camera 16 times; yesterday 20:49-20:59, the video application was in the non-screen-locking foreground running state and accessed the camera (number of times not shown). In this way, based on the time period, multiple access records of an application program for a sensitive resource can be combined into one record with the corresponding access counts added together, which reduces the redundancy of the access records displayed by the electronic device 100.
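The aggregation described above, collecting individual access events into fixed time windows and showing only a per-window count, can be sketched as follows. This is a minimal illustration only; the record fields, window length, and function names are assumptions for the example, not part of the claimed method.

```python
from collections import OrderedDict
from datetime import datetime, timedelta

def aggregate(accesses, window=timedelta(minutes=10)):
    """Group (app, state, timestamp) access events into fixed time windows.

    Produces one summary record per (app, state, window) with an access
    count, mirroring records such as "today 11:25-11:35, the video
    application ... accessed the camera 10 times".
    """
    buckets = OrderedDict()
    for app, state, ts in accesses:
        # Truncate the timestamp down to the start of its window.
        start = datetime.min + ((ts - datetime.min) // window) * window
        key = (app, state, start)
        buckets[key] = buckets.get(key, 0) + 1
    return [
        {"app": app, "state": state,
         "window": f"{start:%H:%M}-{(start + window):%H:%M}",
         "count": n}
        for (app, state, start), n in buckets.items()
    ]

# The three camera accesses from the minute-granularity example above.
events = [
    ("video", "non-screen-locking foreground", datetime(2022, 7, 18, 11, 25, 1)),
    ("video", "non-screen-locking foreground", datetime(2022, 7, 18, 11, 25, 30)),
    ("video", "non-screen-locking foreground", datetime(2022, 7, 18, 11, 25, 50)),
]
print(aggregate(events))
```

Shortening `window` to one minute or lengthening it to one hour reproduces the minute- and hour-granularity displays discussed above.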
The electronic device 100 may receive an up-down sliding operation of the user on the user interface 610, causing the electronic device 100 to display more detail access records of applications for the camera.
In some embodiments, the electronic device 100 may filter to obtain a portion of the detail access records according to information such as the type of the application, the time, the number of accesses, and the running state of the application. The electronic device 100 may also filter to obtain a portion of the detail access records in other manners, which is not limited in the embodiments of the present application.
Fig. 6D-6I schematically illustrate the electronic device 100 filtering and displaying a portion of the detail access records. Fig. 6D-6I are also merely illustrative of the present application, and the user interface may vary in practice.
As shown in fig. 6D, the electronic device 100 receives an input operation (e.g., a click) of the user for the filter control 601, and in response to the input operation of the user, the electronic device 100 may display the prompt 602 as shown in fig. 6E. The prompt 602 is used to prompt the user to select the rule for filtering. Illustratively, the prompt 602 includes a filtering time option and a filtering running state option. The electronic device 100 may receive an input operation of the user for the filtering time option, so that the electronic device 100 may filter to obtain a portion of the detail access records based on the time selected by the user (for example, today), and display that portion. The electronic device 100 may also receive an input operation of the user for the filtering running state option, so that the electronic device 100 may filter to obtain a portion of the detail access records based on the running state of the application selected by the user (for example, the non-screen-locking foreground running state), and display that portion.
For example, as shown in fig. 6E, the electronic device 100 may receive an input operation (e.g., a click) of the user for the filtering running state option, and in response to the input operation of the user, the electronic device 100 may display the prompt 603 as shown in fig. 6F. The prompt 603 is used to prompt the user to select a running state of the application, and the running states of the application shown in the prompt 603 include a screen-locking foreground running state, a screen-locking background running state, a non-screen-locking foreground running state, and a non-screen-locking background running state.
Optionally, the electronic device 100 may receive an input operation (e.g., a click) of the user for the filtering running state option, and in response to the input operation of the user, the electronic device 100 may also display the prompt 604 as shown in fig. 6G. The prompt 604 is used to prompt the user to select the running state of the application, and the running states shown in the prompt 604 include, for example, a screen-locking state, a non-screen-locking state, a foreground running state, and a background running state.
The following embodiments of the present application will be described with reference to the running state of the application program shown in the prompt 603.
As shown in fig. 6H, the electronic device 100 receives a selection operation of the user for the non-screen-locking foreground running state option in the prompt 603, and the electronic device 100 displays the user interface 630 shown in fig. 6I. The user interface 630 shows the detail access records for the camera of each application in the non-screen-locking foreground running state. For example: today 11:25, the video application was in the non-screen-locking foreground running state and accessed the camera 3 times; today 09:02, the instant messaging application was in the non-screen-locking foreground running state and accessed the camera 6 times; yesterday 20:49, the video application was in the non-screen-locking foreground running state and accessed the camera 10 times; yesterday 19:02, the instant messaging application was in the non-screen-locking foreground running state and accessed the camera 30 times.
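The screening behavior shown in user interfaces such as 630, keeping only the detail records that match a selected running state and/or day, reduces to a simple predicate over the stored records. A hypothetical sketch (the record layout and field names are assumptions for illustration, not taken from the patent):

```python
def filter_records(records, state=None, day=None):
    """Keep only detail access records matching the selected running
    state and/or day, as chosen in the screening prompt."""
    out = []
    for r in records:
        if state is not None and r["state"] != state:
            continue
        if day is not None and r["day"] != day:
            continue
        out.append(r)
    return out

records = [
    {"day": "today", "time": "11:25", "app": "video",
     "state": "non-screen-locking foreground", "count": 3},
    {"day": "today", "time": "10:20", "app": "instant messaging",
     "state": "screen-locking background", "count": 6},
    {"day": "yesterday", "time": "20:49", "app": "video",
     "state": "non-screen-locking foreground", "count": 10},
]
# Records for applications in the non-screen-locking foreground state.
print(filter_records(records, state="non-screen-locking foreground"))
```

Passing both `state` and `day` corresponds to combining the filtering running state option with the filtering time option.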
The electronic device 100 may receive a user up-down sliding operation with respect to the user interface 630, and in response to the user sliding operation, the electronic device 100 may display the number of times and time that more other applications access the camera in the non-lock foreground operating state. The embodiments of the present application are not described herein.
In some embodiments, electronic device 100 may receive a user operation such that electronic device 100 may display a detailed access record for a sensitive resource for an application over a period of time, including but not limited to: access time, access times, application running status of the application, etc.
Fig. 6J-6P illustrate diagrams of detailed access records for an application for a sensitive resource over a period of time. Fig. 6J-6P are also merely illustrative of the present application, and the user interface may vary in practice.
By way of example, a certain application may be an instant messaging application and a certain sensitive resource may be a camera sensitive resource.
As shown in fig. 6J, the electronic device 100 receives an input operation (e.g., a click) of the user on the instant messaging application icon, and in response to the input operation of the user, the electronic device 100 may display the user interface 640 as shown in fig. 6K. User interface 640 shows the detail access records of the instant messaging application for the camera sensitive resource. The detail access records include: today 09:03, the instant messaging application was in the screen-locking background running state and accessed the camera 1 time; today 09:02, the instant messaging application was in the non-screen-locking foreground running state and accessed the camera 6 times; today 07:30, the instant messaging application was in the screen-locking background running state and accessed the camera 5 times; yesterday 20:49, the instant messaging application was in the screen-locking background running state and accessed the camera (number of times not shown).
The electronic device 100 may receive a user up-down sliding operation on the user interface 640, such that the electronic device 100 displays more detail access records for the camera for the instant messaging application. The embodiments of the present application are not described herein.
Also included in the user interface 640 is a filter control 605, which the electronic device 100 may receive a user input operation (e.g., a single click) to the filter control 605, and in response to the user input operation, the electronic device 100 may filter and display a portion of the detail access record.
As shown in fig. 6K, the electronic device 100 receives an input operation (e.g., a single click) by a user for the filter control 605, and in response to the input operation by the user, the electronic device 100 may display the prompt 602 as shown in fig. 6L. For the description of the prompt message 602, reference may be made to the description in fig. 6E, and the embodiments of the present application will not be repeated herein.
As shown in fig. 6L, the electronic device 100 receives an input operation (e.g., a single click) of the user for selecting an operation state option in the prompt 602, and in response to the input operation of the user, the electronic device 100 may display the prompt 606 as described in fig. 6M. The prompt 606 is used to prompt the user to select the running state of the application, and the running states of the application shown in the prompt 606 include, for example, a screen-locked foreground running state, a screen-locked background running state, a non-screen-locked foreground running state, and a non-screen-locked background running state.
Optionally, the electronic device 100 may receive an input operation (e.g., a click) of the user for selecting an operation status option in the prompt 602, and in response to the input operation of the user, the electronic device 100 may also display the prompt 607 as described in fig. 6N. The prompt 607 is used to prompt the user to select the running state of the application, and the running states shown in the prompt 607 include, for example, a screen-locked state, a non-screen-locked running state, a foreground running state, and a background running state.
The following embodiments of the present application will be described with reference to the running states of the application program shown in the prompt 606.
As shown in fig. 6O, the electronic device 100 receives a selection operation of the user for the screen-locking background running state option in the prompt 606, and the electronic device 100 displays the user interface 650 as shown in fig. 6P. The user interface 650 shows the detail access records of the instant messaging application for the camera in the screen-locking background running state. For example: today 09:03, the instant messaging application accessed the camera 1 time in the screen-locking background running state; today 07:30, the instant messaging application accessed the camera 5 times in the screen-locking background running state; yesterday 20:49, the instant messaging application accessed the camera 5 times in the screen-locking background running state; yesterday 16:30, the instant messaging application accessed the camera 6 times in the screen-locking background running state.
The electronic device 100 may receive an up-down sliding operation of the user on the user interface 650, and in response to the sliding operation of the user, the electronic device 100 may display more records of the number of times and the times at which the instant messaging application accessed the camera in the screen-locking background running state. Details are not repeated here in the embodiments of the present application.
The foregoing embodiments introduce a record of behavior of an application installed locally on the electronic device 100 to access sensitive resources on the local device.
Optionally, in some implementations, in a multi-device interconnection scenario, the electronic device 100 may also record a record of the behavior of other devices accessing sensitive resources on the local device, or the electronic device 100 may record a record of the behavior of the electronic device 100 accessing sensitive resources on other devices.
Fig. 7 illustrates a schematic diagram of a scenario of a multi-device interconnect.
As shown in fig. 7, the electronic device 100 may establish a connection with one or more devices. The device types of the one or more devices may be various, and the specific types of the plurality of electronic devices are not particularly limited in the embodiments of the present application. For example, the one or more devices may include a tablet, desktop, laptop, handheld, notebook, smart screen, wearable device, augmented reality (AR) device, virtual reality (VR) device, artificial intelligence (AI) device, car machine, smart headset, gaming machine, and may also include an internet of things (IOT) device or smart home device such as a smart water heater, smart light fixture, smart air conditioner, and the like. Without limitation, the plurality of devices in the system 400 may also include non-portable terminal devices such as laptop computers having a touch-sensitive surface or touch panel, desktop computers having a touch-sensitive surface or touch panel, and the like.
The multiple devices may be configured with different software operating systems (OSs), including but not limited to HarmonyOS, etc., where HarmonyOS is the Hongmeng system.
The electronic devices may also all be configured with the same software operating system, e.g., may all be configured with HarmonyOS.
The electronic device 100 may establish a connection with one or more devices in any of the following ways.
Mode one: the electronic device 100 and one or more devices may be connected to the same network, e.g., the electronic device 100 and one or more devices may be connected to the same local area network, establishing a coordinated connection.
Mode two: the electronic device 100 and the one or more devices may also log in to the same system account to establish a collaborative connection. For example, the system account logged in on the electronic device 100 and the one or more devices may be "HW1234".
Mode three: the system account logged in on the electronic device 100 and the system accounts logged in on the one or more devices may belong to the same account group. For example, the system accounts logged in on the electronic device 100 and the one or more devices include "HW001", "HW002", and "HW003", and the system accounts "HW001", "HW002", and "HW003" belong to the account group "Huazhi".
Mode four: the electronic device 100 and one or more devices may establish a connection by near field communication (Near Field Communication, NFC), bluetooth (BT), wireless local area network (wireless local area networks, WLAN), such as wireless fidelity point-to-point (wireless fidelity point to point, wi-Fi P2P), infrared (IR), and the like.
Mode five: the electronic device 100 and one or more devices may establish a temporary account group by scanning the same two-dimensional code, and establish a cooperative connection to implement communication.
Without being limited to the above five manners, the electronic device 100 may also establish a cooperative connection with one or more devices through other manners, which is not limited in the embodiments of the present application.
In addition, the electronic device 100 and the one or more devices may also establish connections and communicate through a combination of any of the several manners described above, which is not limited in the embodiments of the present application.
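The connection manners above are alternatives: a collaborative connection may be established as soon as any one condition holds. A hypothetical sketch of such an eligibility check follows (the field names are illustrative assumptions; modes four and five are represented only as a comment, since they establish one of the checked relationships on success):

```python
def can_connect(a, b):
    """Return True if devices a and b satisfy any collaborative-connection
    condition: same local area network, same system account, or system
    accounts belonging to the same account group."""
    if a.get("lan") and a.get("lan") == b.get("lan"):              # mode one
        return True
    if a.get("account") and a.get("account") == b.get("account"):  # mode two
        return True
    if a.get("group") and a.get("group") == b.get("group"):        # mode three
        return True
    # Modes four and five (NFC/Bluetooth/Wi-Fi P2P pairing, or a temporary
    # account group created by scanning the same two-dimensional code)
    # would populate one of the fields above once pairing succeeds.
    return False

phone = {"account": "HW001", "group": "Huazhi"}
car_machine = {"account": "HW002", "group": "Huazhi"}
print(can_connect(phone, car_machine))  # accounts differ, but same group
```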
Illustratively, the electronic device 100 establishes a connection with the car machine. The electronic device 100 may access information on the car machine, such as sensitive resources like contacts, music, phone, calendar, etc. For example, after the electronic device 100 establishes a Bluetooth connection with the car machine, the electronic device 100 may answer a phone call on the car machine; at this time, the electronic device 100 accesses sensitive information such as the phone on the car machine.
Illustratively, the electronic device 100 establishes a connection with the car machine. The car machine may also access information on the electronic device 100, such as location information, music, contacts, and other sensitive information. For example, after the electronic device 100 establishes a Bluetooth connection with the car machine, the car machine may play music on the electronic device 100; at this time, the car machine accesses sensitive information such as music on the electronic device 100.
Based on this, in the multi-device interconnection scenario, if a device can record the behavior of the local device accessing sensitive resources on other devices, or record the behavior of other devices accessing sensitive resources on the local device, the user can learn how the permissions of the electronic device 100 are used.
Fig. 8A-8G illustrate diagrams of the electronic device 100 displaying behavior records of other devices accessing sensitive resources on the electronic device 100. Fig. 8A-8G are also merely illustrative of the present application, and the user interface may vary in practice.
As shown in fig. 8A, the electronic apparatus 100 receives a sliding operation of a user acting in a first direction (e.g., to the left) on the access record histogram display area, and in response to the sliding operation of the user, the electronic apparatus 100 displays a user interface 810 as shown in fig. 8B. User interface 810 differs from user interface 410 in that user interface 810 records a histogram of access records of other devices accessing sensitive resources locally on electronic device 100, and user interface 410 records a histogram of access records of applications installed locally on electronic device 100 accessing sensitive resources locally.
Optionally, in some embodiments, the electronic device 100 may also receive an input operation (e.g., a click) of the user for the privacy preserving option in fig. 4A, and in response to the input operation of the user, the electronic device 100 may display the user interface 810 as shown in fig. 8B. That is, the electronic device 100 may also directly display the behavior records of other devices accessing sensitive resources on the electronic device 100.
As shown in fig. 8B, the user interface 810 includes a tab set and an access record bar graph. The tab set in the user interface 810 is similar to the tab set 401, and details are not repeated here in the embodiments of the present application. The access record bar graph illustrates the numbers of times other devices accessed sensitive resources locally on the electronic device 100. For example, in fig. 8B, the car machine accessed the location information on the electronic device 100 66 times over a period of time (e.g., 7 days); the smart band accessed the location information on the electronic device 100 25 times over the period; and the electronic device 200 accessed the location information on the electronic device 100 5 times over the period. The electronic device 200 is a mobile phone and is different from the electronic device 100.
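The per-device totals behind an access record bar graph such as the one in user interface 810 amount to a tally of raw access events for one sensitive resource over the display window (e.g., 7 days). A sketch under assumed field names, not a definitive implementation:

```python
from collections import Counter
from datetime import date, timedelta

def device_totals(events, resource, days=7, today=None):
    """Count, per accessing device, the events that touched `resource`
    within the last `days` days - the totals shown in the bar graph."""
    today = today or date.today()
    cutoff = today - timedelta(days=days)
    tally = Counter(
        e["device"] for e in events
        if e["resource"] == resource and e["date"] >= cutoff
    )
    return tally.most_common()  # sorted most-accessed first, like the graph

events = [
    {"device": "car machine", "resource": "location", "date": date(2022, 7, 18)},
    {"device": "car machine", "resource": "location", "date": date(2022, 7, 17)},
    {"device": "smart band", "resource": "location", "date": date(2022, 7, 18)},
    {"device": "car machine", "resource": "camera", "date": date(2022, 7, 18)},
]
print(device_totals(events, "location", today=date(2022, 7, 18)))
# → [('car machine', 2), ('smart band', 1)]
```

Calling the same function with `resource="camera"` corresponds to switching the bar graph to another sensitive resource.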
The user interface 810 also includes controls 801 and 802. Wherein the electronic device 100 may receive an input operation (e.g., a single click) by a user for the control 801, the electronic device 100 may not display a record of behavior of other devices accessing sensitive resources on the electronic device 100 in response to the input operation by the user. The electronic device 100 may also receive an input operation (e.g. a click) of the control 802 by the user, and in response to the input operation by the user, the electronic device 100 may display a detailed record of accessing the sensitive resource on the electronic device 100 by other devices, which is not described herein in detail in the embodiments of the present application.
The electronic device 100 may also receive a user operation to switch to displaying an access record bar graph of other devices accessing other sensitive resources (e.g., the camera) on the electronic device 100 within 7 days. Details are not repeated here in the embodiments of the present application.
In some embodiments, the user may choose to cause the electronic device 100 to stop displaying the records of other devices accessing the respective sensitive resources.
In some embodiments, the user may also choose to cause the electronic device 100 to stop displaying the access records of other devices for some of the sensitive resources. For example, the user may choose to close the access records of other devices for the camera sensitive resource on the electronic device 100.
In some embodiments, the user may also choose to cause the electronic device 100 to stop displaying the access records of some of the other devices for sensitive resources. For example, the user may choose to close the access records of the car machine for sensitive resources on the electronic device 100.
Specific operations may be referred to the descriptions in fig. 5A to 5H, and the embodiments of the present application are not repeated here.
In some embodiments, electronic device 100 may receive a user operation such that electronic device 100 may display, for each sensitive resource, a detail access record for that sensitive resource for other devices over time, including but not limited to: access time, access times, application running status of the application, etc.
Fig. 8C-8D illustrate diagrams of detail access records of other devices for a sensitive resource over a period of time. Fig. 8C-8D are also merely illustrative of the present application, and the user interface may vary in practice.
The one sensitive resource may be a location information sensitive resource, for example.
As shown in fig. 8C, the electronic device 100 receives an input operation (e.g., a single click) by a user for the control 802, and in response to the input operation by the user, the electronic device 100 may display a user interface 820 as described in fig. 8D. User interface 820 shows a detailed access record for other devices for location information sensitive resources over a period of time.
As shown in fig. 8D, user interface 820 illustrates the detail access records for the location information for different devices. The detail access records include: today 11:25, the car machine accessed the location information on the electronic device 100 3 times; today 09:03, the electronic device 200 accessed the location information on the electronic device 100 1 time; today 09:02, the smart band accessed the location information on the electronic device 100 6 times; yesterday 20:49, the car machine accessed the location information on the electronic device 100 (number of times not shown).
The electronic device 100 may receive a user up-and-down sliding operation on the user interface 820 such that the electronic device 100 displays more detailed access records for location information on the electronic device 100 by other devices. The embodiments of the present application are not described herein.
In some embodiments, the electronic device 100 may filter to obtain a part of the detailed access record according to the device type, time, access number, and other information of other devices. The electronic device 100 may also screen to obtain a part of the detailed access record in other manners, which is not limited in the embodiment of the present application.
Fig. 8E-8G schematically illustrate the electronic device 100 filtering and displaying a portion of the detail access records. Fig. 8E-8G are also merely illustrative of the present application, and the user interface may vary in practice.
As shown in fig. 8D, the electronic device 100 receives an input operation (e.g., a click) of the user for the filtering control 803, and in response to the input operation of the user, the electronic device 100 may display the prompt 804 as shown in fig. 8E. The prompt 804 is used to prompt the user to select the rule for filtering. Illustratively, the prompt 804 includes a filtering time option and a filtering device option. The electronic device 100 may receive an input operation of the user for the filtering time option, so that the electronic device 100 may filter to obtain a portion of the detail access records based on the time selected by the user (for example, today), and display that portion. The electronic device 100 may also receive an input operation of the user for the filtering device option, so that the electronic device 100 may filter to obtain a portion of the detail access records based on the device type selected by the user, and display that portion.
For example, as shown in fig. 8E, the electronic device 100 may receive an input operation (e.g., a click) of the screening device option by the user, and in response to the input operation by the user, the electronic device 100 may display the prompt 805 as described in fig. 8F. The prompt 805 is used to prompt the user to select a device type, and the device type shown in the prompt 805 includes, for example, a car set, a smart band, and the electronic device 200.
As shown in fig. 8F, the electronic device 100 receives a selection operation of the user for the car machine option in the prompt 805, and the electronic device 100 displays the user interface 830 shown in fig. 8G. The user interface 830 shows the detail access records of the car machine for the location information on the electronic device 100. For example: today 11:25, the car machine accessed the location information on the electronic device 100 10 times; today 09:02, the car machine accessed the location information on the electronic device 100 6 times; yesterday 20:49, the car machine accessed the location information on the electronic device 100 1 time; yesterday 18:32, the car machine accessed the location information on the electronic device 100 1 time.
The electronic device 100 may receive an up-down sliding operation of the user on the user interface 830, and in response to the sliding operation of the user, the electronic device 100 may display more records of the numbers of times and the times at which the car machine accessed the location information on the electronic device 100. Details are not repeated here in the embodiments of the present application.
In some embodiments, electronic device 100 may receive a user operation such that electronic device 100 may display a detailed access record for a sensitive resource on electronic device 100 for a certain period of time, including but not limited to: access time, access times, application running status of the application, etc. Specifically, similar to the operations in fig. 6J-6P, the embodiments of the present application are not repeated here.
Optionally, in some implementations, in a multi-device interconnect scenario, the electronic device 100 may also record a record of the behavior of the electronic device 100 to access sensitive resources on other devices.
Fig. 9A-9C illustrate diagrams of the electronic device 100 displaying behavior records of the electronic device 100 accessing sensitive resources on other devices. Fig. 9A-9C are also merely illustrative of the present application, and the user interface may vary in practice.
As shown in fig. 9A, the electronic apparatus 100 receives a sliding operation of a user acting on the access record histogram display area in a second direction (e.g., rightward), the first direction being different from the second direction, and in response to the sliding operation of the user, the electronic apparatus 100 displays a user interface 910 as shown in fig. 9B. User interface 910 differs from user interface 410 in that user interface 910 describes an access record bar graph for electronic device 100 to access sensitive resources on other devices, and user interface 410 describes an access record bar graph for applications installed locally on electronic device 100 to access locally sensitive resources.
Optionally, in some embodiments, the electronic device 100 may also receive an input operation (e.g., a click) of the user for the privacy preserving option in fig. 4A, and in response to the input operation of the user, the electronic device 100 may display the user interface 910 as shown in fig. 9B. That is, the electronic device 100 may also directly display the behavior records of the electronic device 100 accessing sensitive resources on other devices.
As shown in fig. 9B, the user interface 910 includes a tab set 901 and an access record bar graph 902. The tab set 901 shows the device identifiers of the devices whose sensitive resources the electronic device 100 accessed within a certain period of time, including but not limited to the car machine, the electronic device 200, and the smart band. The access record bar graph 902 records the types and numbers of times of the sensitive resources on the car machine accessed by the electronic device 100. For example, the electronic device 100 accessed media and files on the car machine 50 times; the electronic device 100 accessed contacts on the car machine 27 times; and the electronic device 100 accessed the location information on the car machine 27 times.
The user interface 910 also includes a control 903 and a control 904. Where electronic device 100 may receive a user input operation (e.g., a single click) on control 903, in response to the user input operation, electronic device 100 may not display a record of behavior of electronic device 100 to access sensitive resources on other devices. The electronic device 100 may also receive an input operation (such as a click) of the control 904 by the user, and in response to the input operation by the user, the electronic device 100 may display a detailed record of the electronic device 100 accessing the sensitive resource on the other device, which is not described herein in detail in the embodiments of the present application.
The electronic device 100 may also receive an access record bar graph that the user operates to switch display that the electronic device 100 accesses sensitive resources on other devices (e.g., smartbands) for a period of time. The embodiments of the present application are not described herein.
In some embodiments, the user may choose to cause electronic device 100 to cease displaying behavior records for electronic device 100 accessing sensitive resources on other devices.
In some embodiments, the user may also choose to cause the electronic device 100 to stop displaying the access records of the electronic device 100 for sensitive resources on some other devices. For example, the user may choose to close the access records of the electronic device 100 for sensitive resources on the car machine.
In some embodiments, the user may also choose to cause the electronic device 100 to stop displaying the access records of the electronic device 100 for some of the sensitive resources on other devices. For example, the user may choose to stop displaying the access records of the electronic device 100 accessing the camera on the car machine.
Specific operations may be referred to the descriptions in fig. 5A to 5H, and the embodiments of the present application are not repeated here.
In some embodiments, electronic device 100 may receive a user operation such that electronic device 100 may display, for a device, a detailed access record for electronic device 100 to access a sensitive resource on the device for a period of time, including but not limited to: the type of sensitive resource, the number of accesses, the access time, etc.
Fig. 9B-9C are schematic diagrams illustrating a detailed access record of an electronic device 100 accessing sensitive resources on a device over a period of time. Fig. 9B-9C are also merely illustrative of the present application, and the user interface may vary in practice.
The certain device may be a car machine, for example.
As shown in fig. 9B, the electronic device 100 receives an input operation (e.g., a single click) by a user for the control 904, and in response to the input operation by the user, the electronic device 100 may display a user interface 920 as described in fig. 9C. User interface 920 shows a detailed access record of electronic device 100 accessing various sensitive resources on board the vehicle.
As shown in fig. 9C, the detailed access records include: today at 11:25, the electronic device 100 accessed media and files on the car machine 10 times. Today at 09:03, the electronic device 100 accessed location information on the car machine 1 time. Today at 09:02, the electronic device 100 accessed contacts on the car machine 6 times. Yesterday at 20:49, the electronic device 100 accessed contacts on the car machine; the number of times is not shown.
The electronic device 100 may receive an up-down sliding operation of the user on the user interface 920, so that the electronic device 100 displays more detailed access records of the electronic device 100 accessing various sensitive resources on the vehicle, which are not described here.
In some embodiments, the electronic device 100 may filter out a portion of the detailed access records according to information such as the type of the sensitive resource on the vehicle, the access time, and the number of accesses, and display that portion. The electronic device 100 may also filter out a portion of the detailed access records in other manners, which is not limited in the embodiments of the present application.
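A minimal sketch of such filtering, assuming a hypothetical record structure (the field and function names below are illustrative, not taken from this application):

```python
from dataclasses import dataclass

@dataclass
class AccessRecord:
    resource_type: str  # e.g., "location", "contacts", "media_and_files"
    access_time: str    # display string, e.g., "today 09:03"
    access_count: int

records = [
    AccessRecord("media_and_files", "today 11:25", 10),
    AccessRecord("location", "today 09:03", 1),
    AccessRecord("contacts", "today 09:02", 6),
]

def filter_records(records, resource_type=None, min_count=None):
    """Keep only records matching the requested type and/or access count."""
    kept = []
    for r in records:
        if resource_type is not None and r.resource_type != resource_type:
            continue
        if min_count is not None and r.access_count < min_count:
            continue
        kept.append(r)
    return kept

# Show only the contacts records:
print(filter_records(records, resource_type="contacts"))
```

Other filter criteria (e.g., a time range) would follow the same pattern of skipping non-matching records.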
Optionally, in a multi-device interconnection scenario, whether the electronic device 100 displays behavior records of other devices accessing sensitive resources on the electronic device 100, or behavior records of the electronic device 100 accessing sensitive resources on other devices, the electronic device 100 may also display the operating state of the electronic device 100 at the time of access, where the operating state includes, but is not limited to, a screen-locked state and a non-screen-locked state.
For example, where the electronic device 100 displays behavior records of other devices accessing sensitive resources on the electronic device 100, referring to the embodiments illustrated in fig. 8A-8C described above, the electronic device 100 receives an input operation (e.g., a single click) by the user on the control 802, and in response, the electronic device 100 may display a user interface 1010 as illustrated in fig. 10A. The user interface 1010 shown in fig. 10A is similar to the user interface 820 shown in fig. 8D, except that the user interface 1010 also shows the operating state of the electronic device 100 when other devices accessed location information on the electronic device 100. For example, today at 11:25, the car machine accessed location information on the electronic device 100 3 times while the electronic device 100 was in the screen-locked state. Today at 09:03, the electronic device 200 accessed location information on the electronic device 100 1 time while the electronic device 100 was in the screen-locked state. Today at 09:02, the smart band accessed location information on the electronic device 100 6 times while the electronic device 100 was in the non-screen-locked state. Yesterday at 20:49, the car machine accessed location information on the electronic device 100; the number of times is not shown.
For another example, where the electronic device 100 displays behavior records of the electronic device 100 accessing sensitive resources on other devices, referring to the embodiments illustrated in fig. 9A-9B described above, the electronic device 100 receives an input operation (e.g., a single click) by the user on the control 904, and in response, the electronic device 100 may display a user interface 1020 as illustrated in fig. 10B. The user interface 1020 shown in fig. 10B is similar to the user interface 920 shown in fig. 9C, except that the user interface 1020 also displays the operating state of the electronic device 100 when the electronic device 100 accessed sensitive resources on the car machine. For example, today at 11:25, the electronic device 100 accessed media and files on the vehicle 10 times while the electronic device 100 was in the screen-locked state. Today at 09:03, the electronic device 100 accessed on-board location information 1 time while the electronic device 100 was in the screen-locked state. Today at 09:02, the electronic device 100 accessed a contact on the vehicle 6 times while the electronic device 100 was in the non-screen-locked state. Yesterday at 20:49, the electronic device 100 accessed contacts on the car machine; the number of times is not shown.
It should be noted that, whether the electronic device 100 displays behavior records of other devices accessing sensitive resources on the electronic device 100, or behavior records of the electronic device 100 accessing sensitive resources on other devices, the electronic device 100 may also display the access record time accurate only to the hour, or display the behavior records by time period, so as to reduce the redundancy of information displayed on an interface of the electronic device 100. For details, reference may be made to the description in the embodiment of fig. 6C, which the embodiments of the application do not repeat here.
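Hour-level rounding of record times, as described, might be sketched like this (the function names and timestamp handling are illustrative assumptions, not the patent's implementation):

```python
from datetime import datetime

def truncate_to_hour(ts: datetime) -> datetime:
    """Drop minutes and seconds so records collapse into hourly buckets."""
    return ts.replace(minute=0, second=0, microsecond=0)

def count_per_hour(timestamps):
    """Number of accesses falling within each hour."""
    buckets = {}
    for ts in timestamps:
        hour = truncate_to_hour(ts)
        buckets[hour] = buckets.get(hour, 0) + 1
    return buckets

accesses = [
    datetime(2022, 7, 18, 11, 25),
    datetime(2022, 7, 18, 11, 40),
    datetime(2022, 7, 18, 9, 3),
]
# Two accesses in the 11:00 hour, one in the 09:00 hour:
print(count_per_hour(accesses))
```

Displaying one row per bucket, instead of one row per access, is what reduces the on-screen redundancy.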
Next, how the electronic device 100 draws a histogram is described.
To draw a histogram, the electronic device 100 needs to obtain the display height of each column in the histogram. The display height of a column represents the number of times a sensitive resource is accessed.
Next, how the electronic device 100 determines the display height of each column in the histogram is described.
The following embodiments are described by taking as an example how the electronic device 100 determines the column heights when applications installed on the electronic device 100 access sensitive resources on the local device.
Illustratively, there are 5 sensitive resources on the electronic device 100, with one histogram for each sensitive resource. Each histogram includes a plurality of columns, and each column represents the number of times an application accesses the sensitive resource.
First, the electronic device 100 determines, for each sensitive resource, the application with the largest number of accesses and that number of accesses. That is, the electronic device 100 may determine 5 maximum access counts.
Thereafter, the electronic device 100 sorts the 5 access counts from high to low. The electronic device 100 sets the column for the highest access count to a first height, the column for the second highest to a second height, the column for the third highest to a third height, the column for the fourth highest to a fourth height, and the column for the fifth highest to a fifth height. If two access counts are equal, the corresponding columns are set to the same height. The first height is greater than the second height, which is greater than the third height, which is greater than the fourth height, which is greater than the fifth height. Illustratively, the first height is H, the second height is 0.875H, the third height is 0.75H, the fourth height is 0.625H, and the fifth height is 0.5H.
If only 4 of the 5 sensitive resources have access records, 4 histograms are displayed. The electronic device 100 determines, for each of the 4 sensitive resources with access records, the application with the largest number of accesses and that number of accesses. That is, the electronic device 100 may determine 4 maximum access counts.
Thereafter, the electronic device 100 sorts the 4 access counts from high to low, setting the column for the highest count to a first height, the second highest to a second height, the third highest to a third height, and the fourth highest to a fourth height. If two access counts are equal, the corresponding columns are set to the same height. The first height is greater than the second height, which is greater than the third height, which is greater than the fourth height. Illustratively, the first height is H, the second height is 0.833H, the third height is 0.666H, and the fourth height is 0.5H.
If only 3 of the 5 sensitive resources have access records, 3 histograms are displayed. The electronic device 100 determines, for each of the 3 sensitive resources with access records, the application with the largest number of accesses and that number of accesses. That is, the electronic device 100 may determine 3 maximum access counts.
Thereafter, the electronic device 100 sorts the 3 access counts from high to low, setting the column for the highest count to a first height, the second highest to a second height, and the third highest to a third height. If two access counts are equal, the corresponding columns are set to the same height. The first height is greater than the second height, which is greater than the third height. Illustratively, the first height is H, the second height is 0.75H, and the third height is 0.5H.
If only 2 of the 5 sensitive resources have access records, 2 histograms are displayed. The electronic device 100 determines, for each of the 2 sensitive resources with access records, the application with the largest number of accesses and that number of accesses. That is, the electronic device 100 may determine 2 maximum access counts.
Thereafter, the electronic device 100 sorts the 2 access counts from high to low, setting the column for the higher count to a first height and the column for the lower count to a second height, where the first height is greater than the second height. Illustratively, the first height is H and the second height is 0.5H.
If only 1 of the 5 sensitive resources has access records, 1 histogram is displayed. The electronic device 100 determines the application with the largest number of accesses of that sensitive resource and that number of accesses. That is, the electronic device 100 may determine 1 maximum access count.
For example, for the 5 sensitive resources, if the maximum access counts are [10, 10, 100, 100, 1000], then the heights of the tallest columns in the respective histograms can be determined as [0.625H, 0.625H, 0.875H, 0.875H, H], with tied counts sharing the same height.
After determining the column height of the application with the largest number of accesses in each sensitive resource's histogram, the electronic device 100 may determine the column heights corresponding to the remaining access counts in each histogram. Specifically, the electronic device 100 may determine the column height for another access count based on the ratio between that access count and the maximum access count, scaled by the column height of the maximum access count in that histogram, so as to obtain a complete histogram, which is not described in detail here in this embodiment.
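As a concrete sketch of the scheme above: assuming the top-column heights step linearly from H down to 0.5H (the 0.833H and 0.666H quoted for the four-resource case appear to be rounded 5/6·H and 2/3·H), and assuming tied counts rank competition-style, the computation might look like this (function names are illustrative):

```python
def top_column_heights(max_counts, H=1.0):
    """Height of the tallest column in each sensitive resource's histogram.

    max_counts: the maximum access count of each resource that has records.
    Heights run linearly from H (largest count) down to 0.5*H (smallest);
    tied counts share the same height.
    """
    n = len(max_counts)
    if n == 1:
        return [H]
    step = 0.5 * H / (n - 1)
    heights = []
    for c in max_counts:
        rank = sum(1 for x in max_counts if x > c)  # counts strictly larger
        heights.append(H - rank * step)
    return heights

def column_height(count, max_count, top_height):
    """Scale the remaining applications' columns by their share of the maximum."""
    return top_height * count / max_count

# Five resources whose busiest applications were accessed these many times;
# tied 10s and tied 100s share heights, 1000 gets the full height H:
print(top_column_heights([10, 10, 100, 100, 1000]))
```

`column_height` implements the ratio-based scaling for the non-maximum applications within one histogram.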
In other scenarios, how the electronic device 100 draws the histogram is similar in principle, and reference may be made to the description of the above embodiments, which are not repeated here.
Optionally, when a histogram is displayed, a label group is further displayed above the histogram, where the label group includes icons of a plurality of sensitive resources. When the electronic device 100 receives an input operation (e.g., a click) on the icon of a sensitive resource, the electronic device 100 may display the icon and the text of the sensitive resource together. However, when the text of a sensitive resource is long, the label cannot completely wrap the icon and the text, and the electronic device 100 displays only part of the text in the label so that the text does not exceed the range of the label. The above takes Chinese as an example; in other languages, such as English or Arabic, the text becomes longer when translated from Chinese, and the icon text may exceed the range of the label.
To solve the above problem, in the embodiments of the present application, when the user clicks the icon of a sensitive resource, in any language (not limited to Chinese), the electronic device 100 can display the icon and the text of the sensitive resource simultaneously, and the font size of the text and/or the length of the label change with the text length of the sensitive resource, so that the label can completely wrap the icon and the text of the sensitive resource.
Specifically, each label is preset with an initial minimum width w1 and an initial maximum width w2, and each label is preset with an initial minimum font size ts1 and an initial maximum font size ts2, where w2 >= w1 and ts2 >= ts1.
When the electronic device 100 switches the language to another language (for example, Arabic), after the electronic device 100 receives an input operation of the user on the icon of a certain sensitive resource (for example, the icon of the camera), the electronic device 100 may first set the font size of the Arabic text of the camera to ts2 and the width of the label to w1, and determine whether the combined width of the camera icon and the Arabic text exceeds w1. If not, the icon and the text are displayed in the label with the set parameters. If so, the font size of the Arabic text is reduced, and it is judged again whether the combined width exceeds w1; if not, the icon and the text are displayed in the label with the newly set parameters. If it still exceeds, the font size is reduced again, until it is reduced to the minimum value ts1.
If the combined width of the camera icon and the Arabic text still exceeds w1 after the font size is reduced to the minimum value ts1, the width of the label is increased, and it is judged again whether the combined width exceeds the current label width; if not, the icon and the text are displayed in the label with the newly set parameters. If it still exceeds, the width of the label is increased again, until the width reaches the maximum value w2.
If the width of the label has increased to the maximum value w2 and the font size of the Arabic text has been reduced to the minimum value ts1 and the content still does not fit, the label cannot completely wrap the Arabic text. The electronic device 100 then displays only the portion of the Arabic text within the label; the portion beyond the label is not displayed.
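The shrink-then-widen procedure described above can be sketched as follows. `text_width_at` is an assumed callback standing in for the platform's text measurement, and the 1-unit step sizes are illustrative simplifications:

```python
def fit_label(text_width_at, icon_w, w1, w2, ts1, ts2):
    """Fit an icon plus text into a label.

    text_width_at(font_size) returns the rendered text width at that size.
    Order matches the description: shrink the font from ts2 toward ts1
    first, then widen the label from w1 toward w2, and report a failure
    (truncation needed) only when both limits are exhausted.
    """
    font, width = ts2, w1
    while icon_w + text_width_at(font) > width and font > ts1:
        font -= 1                     # step 1: shrink the font
    while icon_w + text_width_at(font) > width and width < w2:
        width += 1                    # step 2: widen the label
    fits = icon_w + text_width_at(font) <= width
    return font, width, fits

# A short string fits after shrinking the font a little:
print(fit_label(lambda s: s * 10, 10, 100, 200, 8, 16))   # → (9, 100, True)
# A very long string exhausts both limits and must be truncated:
print(fit_label(lambda s: s * 100, 10, 100, 200, 8, 16))  # → (8, 200, False)
```

Trying the minimum label width at the maximum font first biases the layout toward readable text in a compact label, which matches the order of adjustments in the description.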
As shown in fig. 11, in an Arabic language environment, when the user clicks the icon of a sensitive resource, the font size of the text and/or the length of the label change with the text length of the sensitive resource, so that the label can completely wrap the icon and the text, and the electronic device 100 can display the icon and the text of the sensitive resource in the label at the same time.
Optionally, after the electronic device 100 adjusts the font size of the text of a sensitive resource and/or the length of its label, the electronic device 100 may store the resulting font size and label size; the next time the text and/or label is displayed, the electronic device 100 can use the stored values directly without recalculating, thereby saving processing latency and reducing the amount of computation of the electronic device 100.
The embodiments of the application are described by taking Arabic as an example only; the embodiments of the application do not limit the type of language. The other languages may also be English, Spanish, French, Portuguese, Russian, German, Tamil, Hindi, and the like.
Fig. 12 is a flow chart of a method for recording access behavior of sensitive resources according to an embodiment of the present application.
S1201, the first electronic device displays a first interface, where a first graphic is displayed on the first interface, where the first graphic is used to represent a number of times that one or more application programs access a first sensitive resource on the first electronic device.
The first electronic device may be the electronic device 100.
The first interface may be the user interface 410 shown in fig. 4B.
The first graphic may be an access record bar 402 shown in the user interface 410.
The first sensitive resource may be a camera sensitive resource.
S1202, the first electronic device receives a first operation for the first interface.
The first operation may be an input operation for control 404 shown in fig. 6A.
S1203, in response to the first operation, the first electronic device displays a second interface, where a plurality of behavior records of the one or more applications accessing the first sensitive resource on the first electronic device are displayed.
The second interface may be the user interface 610 shown in fig. 6B.
Wherein the behavior record includes an operating state of the one or more applications, the operating state of the one or more applications including: a lock-screen foreground running state, a lock-screen background running state, a non-lock-screen foreground running state, and a non-lock-screen background running state.
Alternatively, the operating states of the one or more applications may include: a lock-screen running state, a non-lock-screen foreground running state, and a non-lock-screen background running state. That is, in the lock-screen state, whether the application is running in the foreground or the background is no longer distinguished.
Alternatively, the operating states of the one or more applications may include: a screen locking state and a non-screen locking state. That is, in the lock state and the non-lock state, it is no longer possible to distinguish whether the application is running in the foreground or in the background.
Alternatively, the operating states of the one or more applications may include: a foreground running state and a background running state. That is, whether the application is in the lock-screen state or the non-lock-screen state is no longer distinguished.
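The four-state granularity (and, by merging members, the coarser three- and two-state variants) can be modeled with a small enumeration; all type and field names below are illustrative assumptions, not taken from this application:

```python
from dataclasses import dataclass
from enum import Enum

class RunState(Enum):
    LOCKED_FOREGROUND = "lock-screen foreground"
    LOCKED_BACKGROUND = "lock-screen background"
    UNLOCKED_FOREGROUND = "non-lock-screen foreground"
    UNLOCKED_BACKGROUND = "non-lock-screen background"

@dataclass
class BehaviorRecord:
    app_name: str
    resource: str       # e.g., "camera"
    access_time: str
    access_count: int
    state: RunState

rec = BehaviorRecord("Messaging", "camera", "today 11:25", 3,
                     RunState.LOCKED_BACKGROUND)
print(rec.state.value)  # "lock-screen background"
```

Collapsing, say, LOCKED_FOREGROUND and LOCKED_BACKGROUND into one member yields the three-state variant described above.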
By the method for recording the access behaviors of the sensitive resources, the first electronic device can display the behaviors of one or more application programs for accessing each privacy resource under different running states to the user, so that the user can know the behaviors of the application programs simply, conveniently, comprehensively and intuitively.
In one possible implementation, the behavior record further includes one or more of the following: the name of the one or more applications, the time the one or more applications accessed the first sensitive resource, the number of times the one or more applications accessed the first sensitive resource.
Alternatively, the time for one or more applications to access the first sensitive resource may be a period of time, such as a first time to a second time. In this way, the redundancy of the first electronic device displaying information on the user interface may be reduced.
In one possible implementation, a first control is further displayed on the second interface, and after the first electronic device displays the second interface, the method further includes: the first electronic device receives a second operation for the first control; in response to the second operation, the first electronic device displays a third interface, where the third interface displays a plurality of behavior records of the one or more applications accessing the first sensitive resource on the first electronic device in a first running state.
The first control may be a filter control 601 shown in fig. 6D.
The second operation may be an input operation for the filter control 601.
The third interface may be the user interface 630 shown in fig. 6I.
The first running state may be a non-lock-screen foreground running state.
In this way, the user can filter out a record of behavior that causes the first electronic device to only display access to sensitive resources by the application under a certain operating state.
In one possible implementation, a second control is also displayed on the first interface; after the first electronic device displays the first interface, the method further comprises: the first electronic device receives a third operation for the second control; in response to the third operation, the first electronic device ceases displaying the second control and the first graphic on the first interface.
The second control may be control 403 shown in fig. 5A.
The third operation may be an input operation for control 403.
The first electronic device ceasing to display the second control and the first graphic on the first interface may be the user interface 510 described in fig. 5B.
In this way, the user may operate to cause the first electronic device to cease displaying the number of times the one or more applications access the first sensitive resource on the first electronic device, protecting the privacy of the user.
In one possible implementation, the first interface further displays an icon of the first application program; after the first electronic device displays the first interface, the method further comprises: the first electronic device receives a fourth operation of the icon for the first application program; in response to the fourth operation, the first electronic device displays a second graphic on the first interface, the second graphic excluding the number of times the first application accesses the first sensitive resource on the first electronic device.
The icon of the first application may be an instant messaging application icon shown in fig. 5F.
The fourth operation may be an input operation for the instant messaging application icon shown in fig. 5F.
The second graphic may be a bar graph shown in the user interface 530.
In this way, the user may operate such that the first electronic device does not display on the interface the number of times that a portion of the application (e.g., the first application program) accesses the first sensitive resource on the first electronic device, protecting the user's privacy.
In one possible implementation, the first interface also displays an option of a first sensitive resource; after the first electronic device displays the first interface, the method further comprises: the first electronic device receiving a fifth operation for an option of the first sensitive resource; in response to the fifth operation, the first electronic device stops displaying the first graphic.
The option for the first sensitive resource may be a camera icon as shown in fig. 5C.
The fifth operation may be an input operation for the camera icon shown in fig. 5C.
In this way, the user may operate such that the first electronic device does not display on the interface the number of times the application accesses a certain privacy resource (e.g., a first sensitive resource) on the first electronic device, protecting the user's privacy.
Alternatively, after the first electronic device stops displaying the first graphic, the first electronic device may automatically display the number of times the one or more applications access another sensitive resource on the first electronic device. For example, may be the user interface 520 shown in fig. 5E.
In one possible implementation, the first interface also displays an option of a second sensitive resource; after the first electronic device displays the first interface, the method further comprises: the first electronic device receiving a sixth operation for the option of the second sensitive resource; in response to the sixth operation, the first electronic device displays a third graphic on the first interface, the third graphic representing a number of times the one or more applications access the second sensitive resource on the first electronic device.
The option for the second sensitive resource may be an icon for the camera shown in fig. 4B.
The sixth operation may be an input operation for an icon of the camera in the tag group 401 shown in fig. 4B.
The third graphic may be a bar graph in the user interface 420 shown in fig. 4C.
The second sensitive resource may be a camera sensitive resource.
In this way, the first electronic device may switch to display the number of times one or more applications access different privacy resources on the first electronic device.
In one possible implementation, the name of the second sensitive resource in the first language is displayed in the option of the second sensitive resource.
In one possible implementation, the method further includes: after the first electronic device switches the system language from the first language to a second language, the first electronic device displays the name of the second sensitive resource in the second language in the option of the second sensitive resource, where the length of the name of the second sensitive resource in the second language is greater than the length of the name in the first language; and where the length of the option of the second sensitive resource in the second language is greater than the length of the option in the first language, and/or the font size of the name of the second sensitive resource in the second language is smaller than the font size of the name in the first language.
Optionally, icons of the sensitive resources are also displayed in the options of the sensitive resources.
The second language may be an arabic language.
The option for the second sensitive resource may be a tag as described in the embodiment of fig. 11.
Therefore, after the first electronic equipment switches the system language, the first electronic equipment can adaptively adjust the font size of the name of the sensitive resource and/or the length of the option of the sensitive resource, so that the first electronic equipment can completely display the name of the sensitive resource.
Specifically, reference may be made to the description in the embodiment of fig. 11, and the embodiment of the present application will not be repeated here.
In one possible implementation, after the first electronic device displays the first interface, the method further includes: after the first electronic device detects a seventh operation aiming at the first interface, displaying a third interface, wherein a plurality of behavior records of the first electronic device accessing one or more sensitive resources on the first external device are displayed in the third interface; wherein the behavioral record includes one or more of the following: the method comprises the steps of operating state of the first electronic device, name of the first external device, time of the first electronic device accessing one or more sensitive resources on the first external device, and number of times of the first electronic device accessing the one or more sensitive resources on the first external device.
The seventh operation may be a sliding operation in the second direction (e.g., rightward) that acts on the access record histogram display area shown in fig. 9A, and an input operation for the control 904 in the user interface 910 shown in fig. 9B.
The third interface may be the user interface 920 shown in fig. 9C.
The name of the first external device may be the car machine.
In this way, in a multi-device interconnection scenario, the first electronic device may switch to display multiple behavior records of the first electronic device accessing one or more sensitive resources on the first external device.
In one possible implementation, after the first electronic device displays the first interface, the method further includes:
after the first electronic device detects the eighth operation aiming at the first interface, displaying a fourth interface, wherein a plurality of behavior records of one or more external devices accessing the first sensitive resource on the first electronic device are displayed in the fourth interface;
wherein the behavioral record includes one or more of the following: the method comprises the steps of operating state of the first electronic device, names of one or more external devices, time of the one or more external devices accessing the first sensitive resource on the first electronic device, and number of times of the one or more external devices accessing the first sensitive resource on the first electronic device.
The eighth operation may be a sliding operation in the first direction (e.g., to the left) that acts on the access record histogram display area shown in fig. 8A, and an input operation for the control 802 in the user interface 810 shown in fig. 8C.
The fourth interface may be the user interface 820 shown in fig. 8D.
The names of the one or more external devices may be the car machine, the electronic device 200, and the smart band.
In this way, in the scenario of multi-device interconnection, the first electronic device may switch and display multiple behavior records of one or more external devices accessing one or more sensitive resources on the first electronic device.
In one possible implementation, the type of the first sensitive resource includes any one of the following: location information, cameras, microphones, telephones, address books, text messages, storage, calendars, memos, sports health data, photo albums, media and files, music, bluetooth.
In one possible implementation, the type of the external device is any one of the following: a mobile phone, a smart band, a car machine, a Bluetooth headset, a smart large screen, or a smart watch.
The application also provides another sensitive resource access behavior recording method, which comprises the following steps: the first electronic device displays a fifth interface, and a fourth graph is displayed on the fifth interface, wherein the fourth graph is used for representing the number of times the first electronic device accesses one or more sensitive resources on the first external device; the first electronic device receives a ninth operation for the fifth interface; responding to the ninth operation, the first electronic device displays a third interface, and a plurality of behavior records of the first electronic device accessing one or more sensitive resources on the first external device are displayed in the third interface; wherein the behavioral record includes one or more of the following: the method comprises the steps of operating state of the first electronic device, name of the first external device, time of the first electronic device accessing one or more sensitive resources on the first external device, and number of times of the first electronic device accessing the one or more sensitive resources on the first external device.
The fifth interface may be the user interface 910 shown in fig. 9B.
The fourth graphic may be a bar graph in the user interface 910 shown in fig. 9B.
The first external device may be a car head unit.
The ninth operation may be an input operation for control 904 in user interface 910.
The third interface may be the user interface 920 shown in fig. 9C.
In this way, the first electronic device may also display only the behavior records of the first electronic device accessing one or more sensitive resources on an external device.
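A minimal sketch of the behavior record described above, assuming a simple in-memory log; the field names and sample values are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class BehaviorRecord:
    device_name: str        # name of the external device (e.g., "car head unit")
    resource: str           # sensitive resource accessed (e.g., "microphone")
    access_time: datetime   # time of the access
    access_count: int       # number of accesses covered by this record
    running_state: str      # e.g., "unlocked-screen foreground"

def records_for_device(log: List[BehaviorRecord], device: str) -> List[BehaviorRecord]:
    """Return only the behavior records involving the given external device,
    as the third interface described above would display."""
    return [r for r in log if r.device_name == device]

# Hypothetical sample log
log = [
    BehaviorRecord("car head unit", "microphone",
                   datetime(2022, 7, 19, 9, 30), 2, "unlocked-screen foreground"),
    BehaviorRecord("smart band", "sports health data",
                   datetime(2022, 7, 19, 10, 0), 1, "locked-screen background"),
]
```

For example, `records_for_device(log, "car head unit")` would keep only the first record, matching the per-device filtering the third interface performs.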
The one or more possible implementations provided in the embodiment of fig. 12 are also applicable to the sensitive resource access behavior recording method described here; details are not repeated in the embodiments of the present application.
The present application further provides a sensitive resource access behavior recording method, comprising: the first electronic device displays a sixth interface on which a fifth graph is displayed, where the fifth graph represents the number of times one or more external devices access a first sensitive resource on the first electronic device; the first electronic device receives a tenth operation on the sixth interface; in response to the tenth operation, the first electronic device displays a fourth interface, where the fourth interface displays a plurality of behavior records of the one or more external devices accessing the first sensitive resource on the first electronic device; where each behavior record includes one or more of the following: the running state of the first electronic device, the names of the one or more external devices, the time at which the one or more external devices accessed the first sensitive resource on the first electronic device, and the number of times the one or more external devices accessed the first sensitive resource on the first electronic device.
The sixth interface may be the user interface 810 shown in fig. 8B.
The fifth graphic may be a bar graph in the user interface 810 shown in fig. 8B.
The one or more external devices may be, for example, a car head unit, the electronic device 200, and a smart band.
The tenth operation may be an input operation for control 802 in user interface 810.
The fourth interface may be the user interface 820 shown in fig. 8D.
In this way, the first electronic device may also display only the behavior records of the one or more external devices accessing the one or more sensitive resources on the first electronic device.
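The fifth graph described above (a bar graph of per-device access counts) implies a simple aggregation over the behavior records. A hedged sketch, assuming the records are available as `(device_name, access_count)` pairs (the function name and sample data are assumptions, not from the patent):

```python
from collections import Counter
from typing import Dict, Iterable, Tuple

def access_counts_per_device(records: Iterable[Tuple[str, int]]) -> Dict[str, int]:
    """Total the access counts per external device, producing the data
    that a bar graph like the fifth graph would visualize."""
    totals: Counter = Counter()
    for device_name, count in records:
        totals[device_name] += count
    return dict(totals)

# Hypothetical sample: three external devices accessing the first sensitive resource
sample = [
    ("car head unit", 3),
    ("electronic device 200", 1),
    ("smart band", 2),
    ("car head unit", 1),
]
```

Here `access_counts_per_device(sample)` folds the two "car head unit" entries into a single bar of height 4, while the other devices keep their individual totals.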
The one or more possible implementations provided in the embodiment of fig. 12 are also applicable to the sensitive resource access behavior recording method described here; details are not repeated in the embodiments of the present application.
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When software is used, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid-state drive (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the procedures of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the procedures of the above method embodiments may be performed. The aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or the like.
In summary, the foregoing describes only exemplary embodiments of the present invention and is not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, improvement, or the like made in accordance with the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (18)

1. A method for recording access behavior of sensitive resources, the method comprising:
a first electronic device displays a first interface, wherein a first graph is displayed on the first interface, and the first graph is used for representing the number of times one or more applications access a first sensitive resource on the first electronic device;
the first electronic device receives a first operation for the first interface;
in response to the first operation, the first electronic device displays a second interface, in which a plurality of behavior records of the one or more applications accessing the first sensitive resource on the first electronic device are displayed;
wherein the behavior record includes a running state of the one or more applications, and the running state of the one or more applications includes: a locked-screen foreground running state, a locked-screen background running state, an unlocked-screen foreground running state, and an unlocked-screen background running state.
2. The method of claim 1, wherein the behavioral record further comprises one or more of: the name of the one or more applications, the time the one or more applications accessed the first sensitive resource, the number of times the one or more applications accessed the first sensitive resource.
3. The method of claim 1 or 2, wherein the second interface further has a first control displayed thereon, and wherein after the first electronic device displays the second interface, the method further comprises:
the first electronic device receives a second operation for the first control;
and in response to the second operation, the first electronic device displays a third interface, wherein the third interface displays a plurality of behavior records of the one or more applications accessing the first sensitive resource on the first electronic device while the one or more applications are in a first running state.
4. A method according to any one of claims 1-3, wherein a second control is also displayed on the first interface; after the first electronic device displays the first interface, the method further includes:
the first electronic device receives a third operation for the second control;
and in response to the third operation, the first electronic device stops displaying the second control and the first graph on the first interface.
5. A method according to any one of claims 1-3, wherein an icon of a first application is also displayed on the first interface; after the first electronic device displays the first interface, the method further includes:
the first electronic device receiving a fourth operation for an icon of the first application;
in response to the fourth operation, the first electronic device displays a second graph on the first interface, wherein the second graph does not include the number of times the first application accesses the first sensitive resource on the first electronic device.
6. A method according to any of claims 1-3, wherein an option for the first sensitive resource is further displayed on the first interface; after the first electronic device displays the first interface, the method further includes:
the first electronic device receiving a fifth operation for an option of the first sensitive resource;
in response to the fifth operation, the first electronic device stops displaying the first graphic.
7. The method of any of claims 1-6, wherein the first interface further displays an option for a second sensitive resource;
after the first electronic device displays the first interface, the method further includes:
the first electronic device receiving a sixth operation for an option of the second sensitive resource;
in response to the sixth operation, the first electronic device displays a third graphic on the first interface, the third graphic representing a number of times the one or more applications access the second sensitive resource on the first electronic device.
8. The method of claim 7, wherein a name of the second sensitive resource in a first language is displayed in the option of the second sensitive resource.
9. The method of claim 8, wherein the method further comprises:
after the first electronic device switches a system language from the first language to a second language, the first electronic device displays a name of the second sensitive resource in the second language in the option of the second sensitive resource, wherein a length of the name of the second sensitive resource in the second language is greater than a length of the name of the second sensitive resource in the first language; and
a length of the option of the second sensitive resource in the second language is greater than a length of the option of the second sensitive resource in the first language, and/or a font size of the name of the second sensitive resource in the second language is smaller than a font size of the name of the second sensitive resource in the first language.
10. The method of any of claims 1-9, wherein after the first electronic device displays a first interface, the method further comprises:
after the first electronic device detects a seventh operation on the first interface, the first electronic device displays a third interface, wherein a plurality of behavior records of the first electronic device accessing one or more sensitive resources on a first external device are displayed in the third interface;
wherein the behavior record includes one or more of the following: the running state of the first electronic device, the name of the first external device, the time at which the first electronic device accessed the one or more sensitive resources on the first external device, and the number of times the first electronic device accessed the one or more sensitive resources on the first external device.
11. The method of any of claims 1-9, wherein after the first electronic device displays a first interface, the method further comprises:
after the first electronic device detects an eighth operation on the first interface, the first electronic device displays a fourth interface, wherein a plurality of behavior records of one or more external devices accessing the first sensitive resource on the first electronic device are displayed in the fourth interface;
wherein the behavior record includes one or more of the following: the running state of the first electronic device, the names of the one or more external devices, the time at which the one or more external devices accessed the first sensitive resource on the first electronic device, and the number of times the one or more external devices accessed the first sensitive resource on the first electronic device.
12. The method according to any of claims 1-11, wherein the type of the first sensitive resource includes any one of the following: location information, camera, microphone, telephone, address book, text messages, storage, calendar, memo, sports health data, photo album, media and files, music, and Bluetooth.
13. The method according to any one of claims 1-12, wherein the type of the external device is any one of the following: mobile phone, smart band, car head unit, Bluetooth headset, smart screen, smart watch.
14. A method for recording access behavior of sensitive resources, the method comprising:
a first electronic device displays a fifth interface, wherein a fourth graph is displayed on the fifth interface, and the fourth graph is used for representing the number of times the first electronic device accesses one or more sensitive resources on a first external device;
the first electronic device receiving a ninth operation for the fifth interface;
in response to the ninth operation, the first electronic device displays a third interface, wherein a plurality of behavior records of the first electronic device accessing the one or more sensitive resources on the first external device are displayed in the third interface;
wherein the behavior record includes one or more of the following: the running state of the first electronic device, the name of the first external device, the time at which the first electronic device accessed the one or more sensitive resources on the first external device, and the number of times the first electronic device accessed the one or more sensitive resources on the first external device.
15. A method for recording access behavior of sensitive resources, the method comprising:
a first electronic device displays a sixth interface, wherein a fifth graph is displayed on the sixth interface, and the fifth graph is used for representing the number of times one or more external devices access a first sensitive resource on the first electronic device;
the first electronic device receives a tenth operation for the sixth interface;
in response to the tenth operation, the first electronic device displays a fourth interface, wherein a plurality of behavior records of the one or more external devices accessing the first sensitive resource on the first electronic device are displayed in the fourth interface;
wherein the behavior record includes one or more of the following: the running state of the first electronic device, the names of the one or more external devices, the time at which the one or more external devices accessed the first sensitive resource on the first electronic device, and the number of times the one or more external devices accessed the first sensitive resource on the first electronic device.
16. An electronic device, which is a first electronic device, characterized in that the first electronic device comprises: one or more processors, one or more memories; the one or more memories coupled with the one or more processors, the one or more memories to store computer program code comprising computer instructions that the one or more processors invoke to cause the first electronic device to perform the method of any of the above claims 1-13.
17. A computer readable storage medium storing computer instructions which, when run on a first electronic device, cause the first electronic device to perform the method of any of the preceding claims 1-13.
18. A computer program product, characterized in that the computer program product, when run on a first electronic device, causes the first electronic device to perform the method of any of the preceding claims 1-13.
CN202210847377.5A 2022-07-19 2022-07-19 Sensitive resource access behavior recording method and electronic equipment Pending CN117453086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210847377.5A CN117453086A (en) 2022-07-19 2022-07-19 Sensitive resource access behavior recording method and electronic equipment


Publications (1)

Publication Number Publication Date
CN117453086A 2024-01-26

Family

ID=89584200


Country Status (1)

Country Link
CN (1) CN117453086A (en)

Similar Documents

Publication Publication Date Title
KR102470275B1 (en) Voice control method and electronic device
WO2021129326A1 (en) Screen display method and electronic device
CN114467297B (en) Video call display method and related device applied to electronic equipment
CN110058777B (en) Method for starting shortcut function and electronic equipment
CN110114747B (en) Notification processing method and electronic equipment
CN110119296B (en) Method for switching parent page and child page and related device
US20220413695A1 (en) Split-screen display method and electronic device
CN110825469A (en) Voice assistant display method and device
CN112148400B (en) Display method and device in locking state
CN111543042B (en) Notification message processing method and electronic equipment
CN115866122A (en) Application interface interaction method, electronic device and computer-readable storage medium
CN110633043A (en) Split screen processing method and terminal equipment
CN111602108B (en) Application icon display method and terminal
EP3964932A1 (en) Learning-based keyword search method, and electronic device
WO2020192761A1 (en) Method for recording user emotion, and related apparatus
CN112068907A (en) Interface display method and electronic equipment
CN112698756A (en) Display method of user interface and electronic equipment
WO2024045801A1 (en) Method for screenshotting, and electronic device, medium and program product
CN113010076A (en) Display element display method and electronic equipment
CN114077365A (en) Split screen display method and electronic equipment
CN115904160A (en) Icon moving method, related graphical interface and electronic equipment
CN116048243B (en) Display method and electronic equipment
CN113438366A (en) Information notification interaction method, electronic device and storage medium
CN117453086A (en) Sensitive resource access behavior recording method and electronic equipment
CN116991274B (en) Upper sliding effect exception handling method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination