CN117744068A - Trusted user interface display method, trusted user interface display equipment and storage medium - Google Patents

Trusted user interface display method, trusted user interface display equipment and storage medium

Info

Publication number
CN117744068A
CN117744068A (application CN202311511271.9A)
Authority
CN
China
Prior art keywords
touch screen
tui
trusted
virtual machine
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311511271.9A
Other languages
Chinese (zh)
Inventor
陈平原
杨晓明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202311511271.9A
Publication of CN117744068A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/52 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity; Preventing unwanted data erasure; Buffer overflow
    • G06F21/53 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity; Preventing unwanted data erasure; Buffer overflow by executing in a restricted environment, e.g. sandbox or secure virtual machine
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/445 - Program loading or initiating
    • G06F9/44521 - Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533 - Hypervisors; Virtual machine monitors
    • G06F9/45558 - Hypervisor-specific management and integration aspects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533 - Hypervisors; Virtual machine monitors
    • G06F9/45558 - Hypervisor-specific management and integration aspects
    • G06F2009/45579 - I/O management, e.g. providing access to device drivers or storage

Abstract

The application provides a trusted user interface display method, trusted user interface display equipment and a storage medium, and relates to the technical field of communications. In this scheme, in a scenario where multiple trusted execution environments TEE (e.g., two trusted virtual machines TVM1 and TVM2) cooperatively provide a trusted user interface TUI, TVM2 pre-loads a touch screen service and a touch screen driver. Once the touch screen driver detects that a user inputs touch screen data in the TUI, TVM2 immediately sends the touch screen data to TVM1, without TVM1 periodically polling TVM2 for touch screen data. With the improved scheme of this application, the service flow can be simplified, touch screen data can be acquired quickly, and no valid user touch screen operation is lost; a large number of active queries by TVM1 and unnecessary interactions are avoided, data interaction efficiency is improved, and user experience is enhanced.

Description

Trusted user interface display method, trusted user interface display equipment and storage medium
This application is a divisional application of the Chinese patent application filed with the China National Intellectual Property Administration on July 29, 2022, with application number 202210906314.2 and entitled "Touch screen data processing method, device and storage medium based on trusted execution environment".
Technical Field
The present disclosure relates to the field of communications technologies, and in particular, to a touch screen data processing method, an electronic device, and a storage medium based on a trusted execution environment.
Background
With the rapid development of intelligent terminals, mobile terminals increasingly handle sensitive information such as trade secrets and personal privacy, and face various security threats. A hacker may obtain a user's sensitive information (such as a user name, a password, a card number, etc.) by cracking the system, using a malicious input method, logging keystrokes, capturing the screen, etc., which threatens the security of customer accounts.
Currently, trusted execution environment (trusted execution environment, TEE) technology provides trusted user interface (trusted user interface, TUI) functionality. By operating on the TUI, a user can input sensitive information safely: a secure channel is formed among the user input interface, the system and the application program, so that the sensitive information can be transmitted safely to the application layer and the sensitive information input by the user is guaranteed not to be stolen.
In a scheme where multiple TEEs cooperatively provide the TUI, touch screen data (touch panel, TP, data) needs to be transferred from one trusted virtual machine (trusted virtual machine, TVM) to another TVM. Denote the two trusted virtual machines as TVM1 and TVM2; TVM1 needs to use a thread or a timer to actively poll TVM2 for touch screen data at a preset frequency or period. That is, each time TVM1 obtains touch screen data, the following interaction procedure needs to be performed with TVM2: TVM1 actively sends a touch screen data query request to TVM2; TVM2 returns a status to TVM1 indicating that the request has been received; and when TVM2 receives a touch screen input, TVM2 returns the touch screen data to TVM1. The TP data interaction flow between the two TVMs therefore suffers from high implementation complexity, a large number of interactions and poor real-time performance, so the interaction efficiency is low.
Disclosure of Invention
The touch screen data processing method based on a trusted execution environment, the electronic device and the storage medium provided by this application can simplify the flow of exchanging touch screen data among multiple trusted virtual machines TVM in the trusted execution environment and improve data interaction efficiency.
In order to achieve the above purpose, the present application adopts the following technical scheme:
in a first aspect, the present application provides a touch screen data processing method based on a trusted execution environment, applied to an electronic device, where the electronic device includes a host virtual machine VM, a first trusted virtual machine TVM and a second trusted virtual machine TVM, the host virtual machine runs in a complex execution environment REE, and the first trusted virtual machine TVM and the second trusted virtual machine TVM run in a trusted execution environment TEE, and the method includes:
when an operating system of the electronic device is started, the second trusted virtual machine TVM loads a touch screen service for a trusted user interface TUI and a touch screen driver, and the touch screen driver is used for monitoring whether touch screen data exists in the trusted user interface TUI;
when the touch screen driver monitors that a user inputs touch screen data in the trusted user interface TUI, the second trusted virtual machine TVM sends the touch screen data to the first trusted virtual machine TVM.
The first trusted virtual machine TVM may be denoted as TVM1 and applied to the first TEE, and the second trusted virtual machine TVM may be denoted as TVM2 and applied to the second TEE. The first TEE and the second TEE cooperatively provide a trusted user interface TUI service.
According to this scheme, in a scenario where multiple trusted execution environments TEE (such as the two trusted virtual machines TVM1 and TVM2) cooperatively provide a trusted user interface TUI, TVM2 loads a touch screen service and a touch screen driver in advance; once the touch screen driver detects that a user inputs touch screen data in the TUI, TVM2 immediately sends the touch screen data to TVM1, without TVM1 having to periodically poll TVM2 for the touch screen data. With the improved scheme of this application, the service flow can be simplified, touch screen data can be acquired quickly, and no valid user touch screen operation is lost; a large number of active queries by TVM1 and unnecessary interactions are avoided, data interaction efficiency is improved, and user experience is enhanced.
The host virtual machine may be a virtual machine that runs an Android operating system and operates in the complex execution environment.
In the embodiment of the present application, TVM1 is responsible for completing the generation of the graphical interface according to the application specification information, and TVM2 is responsible for displaying the graphical interface generated by TVM1. The touch screen driver of the TVM2 collects touch screen data and sends the touch screen data to the TVM1 through a message channel. The touch screen data may include touch screen position information (X, Y) and event information such as UP/DOWN. The location information is used to indicate the location of the touch screen operation, e.g., a specific location on the virtual keyboard where the touch screen operation is performed, from which it can be determined which numbers or letters or characters in the virtual keyboard the user has selected. The event information is used for indicating the event type of the touch screen; where DOWN indicates the start of the gesture event and UP indicates the end of the gesture event.
After the TVM1 receives the touch screen data of the TVM2, the TVM1 further responds according to the touch screen position information (X, Y), UP/DOWN and other event information, for example, determines the position of the keyboard clicked by the user according to the touch screen position information (X, Y), and determines the content to be displayed in the input box of the TUI according to the determination result. Wherein the content to be displayed may be a combination of letters, numbers and/or characters. The content to be displayed may be any of the following: user name, account number and password, bank account number.
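Illustratively, the touch screen data exchanged between TVM2 and TVM1 could be represented by a small C structure such as the sketch below. The layout and the identifiers (tui_touch_event, tui_touch_type) are assumptions made only for illustration; this application does not prescribe a concrete data format.

    /* Hypothetical layout of one touch screen data record: position (X, Y)
     * plus UP/DOWN event information, as described above. */
    #include <stdint.h>

    enum tui_touch_type {
        TUI_TOUCH_DOWN = 0,   /* start of the gesture event */
        TUI_TOUCH_UP   = 1    /* end of the gesture event   */
    };

    struct tui_touch_event {
        uint16_t x;                 /* horizontal touch screen position */
        uint16_t y;                 /* vertical touch screen position   */
        enum tui_touch_type type;   /* event information: DOWN or UP    */
    };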
In some possible implementations, the method further includes: when the operating system of the electronic device switches to TUI mode, the second trusted virtual machine TVM triggers the touch screen driver to enable TUI mode. After the touch screen driver enables the TUI mode, the second trusted virtual machine TVM continuously listens for touch screen data through the touch screen driver.
In some possible implementations, the method further includes: when the operating system of the electronic device exits the TUI mode, the second trusted virtual machine TVM triggers the touch screen driver to exit the TUI mode. And after exiting the TUI mode, the touch screen driver stops monitoring whether touch screen data exists or not.
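Illustratively, entering and exiting TUI mode on the TVM2 side may amount to switching a flag in the touch screen driver that gates whether touch events are reported to the TUI touch screen service. The sketch below shows one possible form in C; the function names and the flag are assumptions for illustration only.

    /* Sketch: the touch screen driver only reports touch events while
     * TUI mode is enabled; after exiting TUI mode, monitoring stops. */
    #include <stdbool.h>

    static bool tui_mode_enabled = false;   /* hypothetical driver state */

    void tui_touch_enable(void)  { tui_mode_enabled = true;  }  /* on entering TUI mode */
    void tui_touch_disable(void) { tui_mode_enabled = false; }  /* on exiting TUI mode  */

    /* Called from the driver's report path for each touch event. */
    bool tui_touch_should_report(void)
    {
        return tui_mode_enabled;
    }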
The touch screen service and the touch screen driver for the trusted user interface TUI are preloaded in TVM2. Once the screen display content of the electronic device switches to the TUI interface (enters TUI mode), the touch screen driver detects in real time whether a valid user touch screen operation exists; when a valid user touch screen operation is detected, the touch screen driver acquires the touch screen data, and TVM2 of the second TEE actively transmits the touch screen data to TVM1 of the first TEE.
Thus, once the TVM2 side detects a valid user touch screen operation, TVM2 actively submits touch screen data to TVM1 without requiring TVM1 to request touch screen data from TVM2 multiple times. Wherein, TVM1 only needs to monitor the touch screen data from TVM2, and does not need to actively poll TVM 2.
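Illustratively, the TVM1 side may simply block on the message channel and handle each touch screen data record as it arrives, instead of polling. The sketch below assumes a stream-style file descriptor for the channel and reuses the hypothetical tui_touch_event structure sketched above; handle_touch_event is a placeholder, not an interface defined by this application.

    /* Sketch of the TVM1 side: wait for touch screen data pushed by TVM2;
     * no query thread or timer is needed. */
    #include <unistd.h>

    void handle_touch_event(const struct tui_touch_event *ev);  /* placeholder */

    void tvm1_touch_listener(int channel_fd)
    {
        struct tui_touch_event ev;

        /* read() blocks until TVM2 actively sends touch screen data. */
        while (read(channel_fd, &ev, sizeof(ev)) == (ssize_t)sizeof(ev))
            handle_touch_event(&ev);   /* e.g. update the TUI content */
    }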
The transmission of the touch screen data stream can be optimized by improving the current software implementation and workflow: the interaction flow of touch screen data between the two TVMs can be simplified, and the transmission of touch screen data can be completed through a single active notification. In an actual implementation, TVM1 of the first TEE does not need to interact with TVM2 of the second TEE multiple times, so the implementation flow is simplified.
In some possible implementations, the second trusted virtual machine TVM sending the touch screen data to the first trusted virtual machine TVM includes: the second trusted virtual machine TVM sends the touch screen data to the first trusted virtual machine TVM through a first message channel, where the first message channel implements data transmission through a socket.
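As an illustration of such a socket-based first message channel, the sketch below shows TVM2 pushing one touch screen data record over a Unix-domain socket as soon as the driver reports it. The socket path and function name are assumptions for illustration only, and a real implementation would typically keep the channel connected for the whole TUI session rather than reconnecting per event.

    /* Sketch: TVM2 actively pushes touch screen data to TVM1 through a
     * socket-based message channel. Reuses the tui_touch_event sketch above. */
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    int tvm2_push_touch_event(const struct tui_touch_event *ev)
    {
        struct sockaddr_un addr = { .sun_family = AF_UNIX };
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        if (fd < 0)
            return -1;

        /* "/dev/socket/tui_channel" is a placeholder endpoint name. */
        strncpy(addr.sun_path, "/dev/socket/tui_channel", sizeof(addr.sun_path) - 1);
        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
            write(fd, ev, sizeof(*ev)) != (ssize_t)sizeof(*ev)) {
            close(fd);
            return -1;
        }
        close(fd);
        return 0;
    }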
In some possible implementations, the method further includes: when the operating system of the electronic device is started, the second trusted virtual machine TVM loads a TUI display driver.
In some possible implementations, the method further includes: the second trusted virtual machine TVM displays a first user interface, where the first user interface is a trusted user interface; the second trusted virtual machine TVM receives a touch screen operation of a user in the first user interface; and in response to the touch screen operation of the user, the second trusted virtual machine TVM calls the touch screen driver to collect the touch screen data.
This method can avoid a large number of active queries by TVM1. When TVM1 queries at a high frequency, some of the queries are inevitably invalid, so avoiding these active queries also avoids unnecessary interactions. In an actual implementation, TVM1 of the first TEE no longer needs to spend CPU time on repeated queries, so invalid queries are avoided and energy consumption is saved.
In addition, this method avoids the problem of touch screen data being lost when TVM1 queries at a low frequency, thereby optimizing the user experience. In an actual implementation, no valid user touch screen operation is lost, and the user experience is improved.
In some possible implementations, after the receiving user's touch screen operation in the first user interface, the method further includes: the second trusted virtual machine TVM determines that the touch screen operation meets a preset touch screen condition; the preset touch screen condition is used for judging whether the touch screen operation is effective touch screen operation or not.
In some possible implementations, the touch screen operation includes an input in a preset area of the first user interface, where the preset area is an area for inputting user privacy information.
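Illustratively, the preset touch screen condition could be as simple as a hit test that checks whether the touch position falls inside the preset user privacy information input area. The sketch below is only one possible reading; the rectangle type and its fields are assumptions, not part of this application.

    /* Sketch: a touch is treated as a valid TUI touch screen operation only
     * if it lands inside the preset privacy information input area. */
    #include <stdbool.h>
    #include <stdint.h>

    struct tui_rect {                      /* hypothetical preset area */
        uint16_t left, top, right, bottom;
    };

    bool tui_touch_is_valid(uint16_t x, uint16_t y, const struct tui_rect *area)
    {
        return x >= area->left && x < area->right &&
               y >= area->top  && y < area->bottom;
    }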
Optionally, in the embodiment of the present application, the touch screen operation of the user may be a click input (for example, a single click input or a double click input), a slide input, or any other possible input, which may be specifically determined according to actual use requirements, and the embodiment of the present application is not limited.
In some possible implementations, after the second trusted virtual machine TVM sends the touch screen data to the first trusted virtual machine TVM, the method further includes:
a second trusted virtual machine TVM receives a second user interface sent by the first trusted virtual machine TVM, wherein the second user interface is a trusted user interface generated by the first trusted virtual machine TVM according to the touch screen data;
and the second trusted virtual machine TVM calls a TUI display driver and updates the displayed first user interface to the second user interface, where the TUI display driver is a driver for triggering display of the trusted user interface.
In some possible implementations, the host virtual machine VM runs a client application CA and is preset with an operating system kernel and a first application programming interface API, where the first API is an interface function between the host virtual machine and the first trusted virtual machine;
the first trusted virtual machine TVM runs a trusted application TA, and is pre-provided with a second API, a TUI framework and a trusted execution environment kernel, wherein the second API is an interface function for calling the TUI framework;
the second trusted virtual machine TVM is preset to provide a trusted user interface TUI service for the trusted application TA, where the TUI service includes a TUI display service and a TUI touch screen service, the TUI display service is associated with a TUI display driver, and the TUI touch screen service is associated with a TUI touch screen driver.
In some possible implementations, before the second trusted virtual machine TVM displays the first user interface, the method further includes: the client application CA receives an operation of a user on the client application CA; the client application CA calls the first API and initiates a TUI display request to the trusted application TA through the operating system kernel; in response to the TUI display request, the trusted application TA obtains a first user interface corresponding to the client application CA; and the trusted application TA sends the TUI display request and the first user interface to the TUI service.
Wherein the second trusted virtual machine TVM displays a first user interface comprising: and responding to the TUI display request sent by the trusted application TA, calling a TUI display driver by the TUI service, and displaying the first user interface.
In some possible implementations, the trusted application TA sending the TUI display request and the first user interface to the TUI service includes: the trusted application TA calls the second API, enters a TUI framework, and then sends the TUI display request and the first user interface to the TUI service through the trusted execution environment kernel and inter-process communication IPC.
In some possible implementations, after the client application CA receives the user operation on the client application CA, before the client application CA invokes the API to initiate a TUI display request to the trusted application TA, the method further includes: responding to the operation of a user on the client application CA, judging whether a to-be-displayed interface of the client application CA contains a user privacy information input area or not; and triggering the electronic equipment to switch from a non-TUI mode to a TUI mode when the interface to be displayed of the client application CA contains a user privacy information input area.
The non-TUI mode is a running mode corresponding to the electronic equipment in the REE environment, and the TUI mode is a running mode corresponding to the electronic equipment in the TEE environment.
In some possible implementations, the client application CA invoking the API to initiate a TUI display request to the trusted application TA, including: when the electronic equipment is switched to the TUI mode, the client application CA calls the API to initiate a TUI display request to the trusted application TA.
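Illustratively, the CA-side decision described above can be summarized as: check whether the interface to be displayed contains a user privacy information input area, switch to TUI mode if it does, and only then initiate the TUI display request. The sketch below models that control flow in C; the called functions are hypothetical placeholders standing in for the real mode switch and the first API call made through the operating system kernel.

    /* Sketch of the CA-side control flow before a TUI display request.
     * The two helpers are placeholders, not APIs defined by this application. */
    #include <stdbool.h>
    #include <stdio.h>

    static void device_switch_to_tui_mode(void) { puts("switch from non-TUI mode to TUI mode"); }
    static int  ca_invoke_tui_display_api(void) { puts("TUI display request to the TA"); return 0; }

    int ca_on_user_operation(bool interface_has_privacy_input_area)
    {
        if (!interface_has_privacy_input_area)
            return 0;                        /* stay in non-TUI mode */

        device_switch_to_tui_mode();         /* trigger non-TUI -> TUI mode switch */
        return ca_invoke_tui_display_api();  /* CA calls the API towards the trusted application TA */
    }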
With the improved scheme of this application, the trusted user interface TUI of the TA corresponding to the CA can acquire touch screen data quickly, so the service flow is simplified, invalid queries are avoided, energy consumption is saved, no user touch screen operation is lost, and user experience is improved.
In a second aspect, the present application provides a touch screen data processing apparatus based on a trusted execution environment, the apparatus comprising means for performing the method of the first aspect described above. The apparatus may correspond to performing the method described in the first aspect, and the relevant descriptions of the units in the apparatus are referred to the description of the first aspect, which is omitted herein for brevity.
The method described in the first aspect may be implemented by hardware, or may be implemented by executing corresponding software by hardware. The hardware or software includes one or more modules or units corresponding to the functions described above. Such as a processing module or unit, a display module or unit, etc.
In a third aspect, the present application provides an electronic device comprising a processor coupled to a memory, the memory for storing computer programs or instructions, the processor for executing the computer programs or instructions stored by the memory, such that the method of the first aspect is performed. For example, a processor is configured to execute a computer program or instructions stored in a memory, to cause the apparatus to perform the method in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program (which may also be referred to as instructions or code) for implementing the method in the first aspect. For example, the computer program, when executed by a computer, causes the computer to perform the method of the first aspect.
In a fifth aspect, the present application provides a chip comprising a processor. The processor is configured to read and execute a computer program stored in the memory to perform the method of the first aspect and any possible implementation thereof. Optionally, the chip further comprises a memory, and the memory is connected with the processor through a circuit or a wire.
In a sixth aspect, the present application provides a system-on-chip comprising a processor. The processor is configured to read and execute a computer program stored in the memory to perform the method of the first aspect and any possible implementation thereof. Optionally, the chip system further comprises a memory, and the memory is connected with the processor through a circuit or a wire.
In a seventh aspect, the present application provides a computer program product comprising a computer program (which may also be referred to as instructions or code) which, when executed by a computer, causes the computer to carry out the method of the first aspect.
It will be appreciated that the advantages of the second to seventh aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
Fig. 1 is an application scenario schematic diagram of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application;
FIG. 2 is an interface schematic diagram of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application;
FIG. 3 is a flowchart of a touch screen data processing method based on a trusted execution environment disclosed in the related art;
FIG. 4 is a software module interaction diagram of a touch screen data processing method based on a trusted execution environment disclosed in the related art;
FIG. 5 is an interactive timing diagram of a touch screen data processing method based on a trusted execution environment disclosed in the related art;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic diagram of a basic software architecture of an electronic device according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a software architecture based on a trusted execution environment according to an embodiment of the present disclosure;
FIG. 9 is a flowchart of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application;
FIG. 10 is a software module interaction diagram of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application;
fig. 11 is an interaction timing diagram of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The term "and/or" herein is an association relationship describing an associated object, and means that there may be three relationships, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone. The symbol "/" herein indicates that the associated object is or is a relationship, e.g., A/B indicates A or B.
The terms "first" and "second" and the like in the description and in the claims are used for distinguishing between different objects and not for describing a particular sequential order of objects. In the description of the embodiments of the present application, unless otherwise specified, the meaning of "a plurality of" means two or more, for example, a plurality of processing units means two or more processing units and the like; the plurality of elements means two or more elements and the like.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
The terminal device in the embodiment of the application may be a mobile phone, a personal digital assistant (personal digital assistant, PDA), a tablet computer or another intelligent device. The terminal device can be deployed with a non-secure operating environment and a secure operating environment: the non-secure operating environment is the complex execution environment (rich execution environment, REE) on the terminal device, which runs operating systems such as Android, iOS, Windows Phone and the like; the secure operating environment is the trusted execution environment (trusted execution environment, TEE), which runs a secure operating system. The software and hardware resources accessed by the TEE are isolated from the REE: the software and hardware resources on the terminal device can each be marked as belonging to one of two execution environment states, where resources marked as being in the secure execution state can only be accessed by the TEE, and resources marked as being in the non-secure execution state can be accessed by both execution environments. The TEE thus constructs a secure execution environment isolated from the REE, which can provide a secure execution environment for authorized trusted software.
To facilitate understanding of embodiments of the present application, some of the terms of embodiments of the present application are explained below to facilitate understanding by those skilled in the art.
REE, a complex execution environment, generally refers to an execution environment that does not have specific security functions, such as an Android operating system or an IOS operating system. Note that, the REEs may be referred to as "untrusted execution environments", "normal execution environments", "unsafe execution environments", "rich execution environments", etc., in addition to "complex execution environments", which are not limited by the embodiments of the present application.
TEE, a trusted execution environment, is an independent processing environment with computing and storage functions that provides security and integrity protection. The basic idea is as follows: an isolated memory is allocated for sensitive data in hardware, all computation of the sensitive data is performed in the isolated memory, and other parts of the hardware except for authorized interfaces cannot access information in the isolated memory. Thereby realizing privacy calculation of sensitive data.
In contrast, REEs are an open environment that is vulnerable to attacks, such as theft of sensitive data, theft of mobile payments, and the like; and TEE is a secure area on the central processor that can ensure that sensitive data is processed within an isolated and trusted environment, thereby protecting against software attacks from the REEs. In addition, the TEE may protect the integrity and confidentiality of the TA end-to-end, providing greater processing power and memory space, as compared to other secure execution environments.
The REE+TEE architecture is an architecture that provides services for applications in combination with the REE through the TEE. That is, the TEE is co-present with the REE in the electronic device. Through the support of hardware, the isolation of the TEE and the REE is realized, the security capability is realized, and the software attack easily suffered by the conventional REE side can be resisted. The TEE has a running space of the TEE and defines strict protection measures, so that the security level is higher than that of the REEs, data, software and the like in the TEE can be protected from being attacked by the software, and specific types of security threats are resisted.
This scheme adopts an REE + multiple TEE architecture, where the multiple TEEs are exemplified by a first TEE and a second TEE. The first TEE runs the security system and the second TEE runs the TUI service. The first TEE and the second TEE cooperate to implement TUI functions, such as the TUI display and TUI touch screen functions. TVM2 of the second TEE (also referred to as the TUI TVM) is the trusted virtual machine that runs the TUI server; it integrates the TUI drivers and provides the TUI display and TUI input functions to TVM1 of the first TEE.
OEM TEE: trusted execution environments for original equipment manufacturers. For example, the first TEE takes the role of an OEM TEE.
The TA, i.e. the trusted application, is an application running in the first TEE, capable of providing security services for CAs running outside the first TEE, such as entering passwords, generating transaction signatures, face recognition, etc.
CA, i.e. client application. The CA generally refers to an application running in the REE, but when one TA calls another TA, the TA that initiates the call may also act as a CA. The CA may call the TA through a client application programming interface (application programming interface, API) and instruct the TA to perform the corresponding security operation.
TUI touch screen service and touch screen driver: listening for touch screen events from the user. The touch screen event corresponds to touch screen data, and the touch screen data comprises position information and event information.
The system architecture to which various exemplary embodiments of the present application relate is described below. It should be noted first that some platforms have a native TEE based on TrustZone technology, while one or more other TEE systems, i.e. one or more trusted virtual machines TVM, run using virtual machine monitor (Hypervisor) technology. On such a platform, the OEM TEE (the first TEE) is allowed to perform TUI operations through the second TEE, which avoids the deep coupling that would result from integrating the TUI drivers in the OEM TEE.
Fig. 1 illustrates a system architecture diagram according to various exemplary embodiments of the present application. As shown in fig. 1, the system architecture includes a host virtual machine Android VM, a trusted virtual machine TVM 1, and a trusted virtual machine TVM 2.
The host virtual machine runs the Android system and is used to run applications of the non-secure world. The trusted virtual machine TVM1 is applied to a first TEE that runs a security system. The trusted virtual machine TVM2 is applied to a second TEE that runs the TUI service, so TVM2 may also be referred to as the TUI TVM.
In the scenario shown in fig. 1, the TUI may be provided through cooperation of multiple trusted virtual machines.
It should be noted that TVM2 integrates the TUI device drivers; it is the core system that completes the TUI function and provides the TUI display and TUI input functions to TVM1. TVM1 provides TUI functions for the Android application, but TVM1 does not directly integrate the TUI device drivers; instead, it submits TUI requests to TVM2 through a VM IPC (virtual machine inter-process communication) service, and it is TVM2 that actually completes the TUI display and TUI touch screen functions.
It should be noted that in the case of more TEE TVMs (e.g., two or more virtual machines like TVM1), the scheme can be extended in the same manner, with all TUI requests handled uniformly by the TUI TVM, avoiding the need to integrate TUI drivers in every TEE VM.
The function of each module in fig. 1 is explained below.
Client application CA (also referred to as non-secure application): the non-secure application runs in the REE environment, that is, the running environment of the Android operating system. When the application needs to interact with the TEE, the TUI trusted application in the TVM1 is driven to execute related operations through the GP TUI API in fig. 1. Wherein the GP TUI API is a client application programming interface (application programming interface, API) of the trusted execution environment.
TUI trusted application: provides services for the applications of the non-secure world on the Android side.
GP TUI API: the global platform (global platform) defines the standard APIs for the TUI, and the TEE TVM supports the functionality of providing the TUI through these standard interfaces.
TUI framework: the TUI framework inside the TEE, which completes the core logic of the TUI.
VM IPC client: the TUI framework invokes VM IPC client related modules (mainly libTrustedUI, minkIPC, VMSocket) that can interact with the TUI server running in the TUI TVM.
A TEE kernel: providing access to the TUI server.
TUI server: i.e., the TUI service running on a trusted virtual machine. The TUI server listens for messages from the client and invokes a TUI display driver and a TUI touch screen driver (also referred to as an input driver) upon request.
TUI display driver: performs the display output to the TUI screen.
TUI touch screen drive: listening for touch screen events from the user.
The interaction (call) process between the various modules is described below in conjunction with the arrow directions shown in fig. 1.
And an application on the Android operating system initiates a request to a TUI trusted application in the TVM1 by calling a client API of the TEE through a system kernel. Then, the TUI trusted application in TVM1 enters the TUI framework through the GP TUI API call. Then, the TUI framework interacts with the TUI server through the VM IPC (inter-process communication) client and the TEE kernel. The TUI framework outputs the graphical interface of the TUI to the TUI server, acquires touch screen operation information (indicating touch screen operation) of a user from the TUI server, and then generates a new graphical interface according to the touch screen operation and updates the new graphical interface to the TUI server. And finally, the TUI server outputs the new graphical interface to the TUI display driver, and displays the new graphical interface through the TUI display driver.
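Illustratively, the core loop of the TUI framework described above (output a graphical interface to the TUI server, obtain the user's touch screen operation information, generate a new graphical interface, update it to the TUI server) can be summarized by the control-flow sketch below. Every function and type in it is a hypothetical placeholder used only to make the sequence concrete.

    /* Control-flow sketch of the TUI framework loop shown in fig. 1. */
    struct tui_frame;        /* one TUI graphical interface (opaque placeholder) */
    struct tui_touch_info;   /* touch screen operation information (opaque placeholder) */

    void tui_server_display(struct tui_frame *frame);           /* output the TUI to the TUI server */
    struct tui_touch_info *tui_server_wait_touch(void);         /* touch screen info from the user  */
    int tui_session_finished(const struct tui_touch_info *t);   /* e.g. input confirmed             */
    struct tui_frame *tui_generate_next_frame(struct tui_frame *cur,
                                              const struct tui_touch_info *t);

    void tui_framework_session(struct tui_frame *frame)
    {
        for (;;) {
            tui_server_display(frame);                          /* show current interface  */
            struct tui_touch_info *t = tui_server_wait_touch(); /* wait for the user input */
            if (tui_session_finished(t))
                break;
            frame = tui_generate_next_frame(frame, t);          /* new graphical interface */
        }
    }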
Next, a schematic diagram of the TUI interface and the UI interface is described with reference to fig. 2. As shown in fig. 2 (a), the mobile phone displays an APP login interface, which is a UI interface. As shown in fig. 2 (b), the mobile phone displays a password input interface, which is a TUI interface because the interface can input sensitive information such as a password for a user to log in the APP.
It is appreciated that the TEE provides trusted user interface TUI functionality. By operating on the TUI, the user can safely input the sensitive information, a safe channel is formed among the user input interface, the system and the application program, the sensitive information can be safely transmitted to the application layer, and the sensitive information input by the user is ensured not to be stolen.
The TUI driver presents an input interface to the user through a secure display buffer (secure display buffer), avoids inputting and transmitting the user's sensitive information through the non-secure world (normal world), and obtains the user's input directly through secure input (secure input/secure touch), thereby protecting the sensitive information entered by the user and improving the security of the system.
In the scheme where multiple TEEs cooperatively provide the TUI, touch screen data (TP data) needs to be transferred from one trusted virtual machine TVM to another. The TP data interaction flow between two TVMs may have problems in terms of implementation complexity, interaction times, real-time performance, and the like, resulting in lower interaction efficiency.
Illustratively, fig. 3 shows a basic workflow of transmitting touch screen data between different TVMs in a conventional scheme of the TEE TUI. As shown in fig. 3, it is assumed that there are two trusted virtual machines TVM1 and TVM2, and TVM1 needs to use a thread or a timer to actively poll TVM2 for touch screen data at a preset frequency or period. Through this scheme, after TVM2 loads the touch screen service (step A1), each time TVM1 obtains touch screen data, the following interaction procedure needs to be executed with TVM 2:
step A2: TVM1 actively issues a touch screen data query request to TVM 2.
Step A3: TVM2 returns a status to TVM1 indicating that the request has been received.
Step A4: when TVM2 receives a touch screen input, TVM2 returns touch screen data to TVM 1.
Steps A2 to A4 are circularly performed between TVM1 and TVM 2. Thus, TVM1 actively polls TVM2 for touch screen data according to a preset query period.
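For contrast, the conventional polling described in steps A2 to A4 behaves like the timer-driven loop sketched below on the TVM1 side: queries are issued at a fixed period whether or not touch screen data exists. The period value and all function names are assumptions made only to illustrate the flow being replaced.

    /* Sketch of the conventional scheme: TVM1 actively polls TVM2 for
     * touch screen data at a preset period (steps A2 to A4 in a loop). */
    #include <stdbool.h>
    #include <unistd.h>

    /* Placeholders for the TVM1 -> TVM2 query interface and the handler. */
    static bool tvm2_query_touch_data(void *out) { (void)out; return false; }
    static void tvm1_handle_touch_data(void *ev) { (void)ev; }

    void tvm1_poll_loop(void)
    {
        char event[16];                 /* placeholder buffer for touch screen data */

        for (;;) {
            /* Steps A2/A3: send a query; TVM2 acknowledges receipt. */
            if (tvm2_query_touch_data(event))
                tvm1_handle_touch_data(event);   /* step A4: data returned this period */
            /* Too short a period wastes CPU on invalid queries; too long a
             * period risks losing valid user touch screen operations. */
            usleep(10 * 1000);          /* assumed 10 ms polling period */
        }
    }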
Fig. 4 shows a software framework of the flow shown in fig. 3 and a request and transmission flow of touch screen data in the software framework. As shown in fig. 4, software modules such as a listening service, a touch screen service, and a touch screen driver are included in the TVM 2. In the conventional scheme, as shown in fig. 4, steps 1 to 9 are executed according to the following interaction flow, so as to complete transmission of touch screen data:
Step 1: on the TVM2 side, when the listening service listens to the system start-up, the listening service notifies the touchscreen service to load the TUI touchscreen service.
Step 2: the touch screen service loads the touch screen driver.
Step 3: TVM1 acts as a client to initiate a request to query for touch screen data (referred to as a touch screen data query request).
Step 4: the listening service of TVM2 listens for a request, TVM2 returns a touch screen query status to TVM1 indicating that a request has been received.
Step 5: the listening service requests touch screen data from the touch screen service.
It should be noted that, the touch screen service may collect touch screen data through a touch screen driver.
Step 6: the touch screen driver of TVM2 notifies the touch screen service of the touch screen data after the touch screen data is collected.
Step 7: the touch screen service forwards the touch screen data to the listening service.
Step 8: the listening service transmits the touch screen data to the message channel.
Step 9: TVM2 returns touch screen data to TVM1 through the message channel.
The message channel is a channel from TVM2 to TVM1 for transmitting data.
Fig. 5 shows a timing diagram for transmitting touch screen data between two TVMs. As shown in fig. 5, the timing chart includes S1 to S23.
S1: on the TVM2 side, when the listening service listens to the system start-up, the listening service notifies the touchscreen service to load the TUI touchscreen service.
In the conventional scheme, the listening service not only forwards processing command data and display-related data, but is also responsible for forwarding touch screen data.
S2: the TVM2 side touch screen service loads the touch screen driver.
The following steps illustrate that TVM1 actively polls TVM2 for touch screen data according to a preset query period.
S3: TVM1 notifies the system to switch to TUI mode.
S4: The TVM2 side listening service detects that the system has switched to TUI mode.
S5: The TVM2 side listening service notifies the touch screen service that the system is currently in TUI mode.
S6: The touch screen service triggers the TUI touch screen driver to enable TUI mode.
S7: The TUI touch screen driver enables TUI mode and continuously monitors whether touch screen data exists.
S8: TVM1, as a client, initiates a request to query touch screen data.
In a conventional scheme, TVM1 will actively poll TVM2 for touch screen data at a preset frequency or period using a thread or timer, see S9 to S14 and S15 to S23 below.
S9: the TVM2 side listening service monitors the request, and TVM2 returns a touch screen inquiry state to TVM1 indicating that the request has been received.
S10: the TVM2 side monitoring service requests the touch screen service to inquire the touch screen data.
S11: the TVM2 side touch screen service attempts to query the touch screen driver for touch screen data.
S12: the TVM2 side touch screen driver detects whether there is touch screen data.
S13: the TVM2 side touch screen drive does not detect touch screen data.
S14: the TVM2 side touch screen driver notifies the touch screen service that the touch screen data is not detected, then the touch screen service notifies the monitoring service that the touch screen data is not detected, and then the non-detected touch screen data is fed back to the TVM1 through the monitoring service.
It should be noted that, the step S14 is an optional step, that is, in some embodiments, S14 is performed after S13, and in other embodiments, S14 is not performed after S13.
S9 to S14 above are a query period, where TVM1 actively queries TVM2 for touch screen data, but does not query touch screen data.
S15: the TVM1 sends the touch screen data query request to the TVM2 again according to the preset query period.
S16: the TVM2 side listening service listens for a request, and TVM2 returns a touch screen query status to TVM1 indicating that the request has been received.
S17: the TVM2 side monitoring service requests the touch screen service to inquire the touch screen data.
S18: the TVM2 side touch screen service attempts to query the touch screen driver for touch screen data.
S19: the TVM2 side touch screen driver detects whether there is touch screen data.
S20: the TVM2 side touch screen driver detects touch screen data.
S21: the TVM2 side touch screen driver notifies the touch screen service of touch screen data.
S22: the TVM2 side touch screen service forwards the touch screen data to the listening service.
S23: the TVM2 side monitoring service transmits touch screen data to the message channel and returns the touch screen data to the TVM1 through the message channel.
S15 to S23 above are still another query period, and TVM1 actively queries TVM2 for touch screen data, and queries the touch screen data.
In the above manner, the TVM1 actively polls the TVM2 for touch screen data according to a preset query period. It can be seen that the conventional scheme includes a request flow of touch screen data and a transmission flow of touch screen data. The request flow of the touch screen data is as follows: TVM1→listening service of TVM2→touch screen service→touch screen drive. Correspondingly, the transmission flow of the touch screen data is as follows: touch screen driving of TVM2→touch screen service→listening service→message channel→tvm1.
In the conventional scheme described above, TVM1 needs to actively send touch screen data query requests to TVM2: TVM1 is in an active role and TVM2 in a passive role. This brings several problems. A query may or may not return touch screen data. When the preset query period is small, that is, the query frequency is high, some of the queries are inevitably invalid, so CPU resources are wasted on invalid queries. When the preset query period is large, that is, the query frequency is low, touch screen data may be lost. It can therefore be seen that the touch screen data interaction flow between the two TVMs in the related art suffers from a large number of interactions, low interaction efficiency and possible loss of touch screen data.
In view of this, the embodiments of the present application provide a touch screen data processing method and an electronic device based on a trusted execution environment, so as to improve user experience by improving the bottom layer of a mobile phone system.
The embodiment of the application can optimize the transmission of the touch screen data stream by improving the current software implementation and workflow.
In this scheme, in a scenario where multiple trusted execution environments TEE (e.g., two trusted virtual machines TVM1 and TVM2) cooperatively provide a trusted user interface TUI, TVM2 pre-loads a touch screen service and a touch screen driver. Once the touch screen driver detects that a user inputs touch screen data in the TUI, TVM2 immediately sends the touch screen data to TVM1, without TVM1 periodically polling TVM2 for touch screen data. With the improved scheme of this application, the service flow can be simplified, touch screen data can be acquired quickly, and no valid user touch screen operation is lost; a large number of active queries by TVM1 and unnecessary interactions are avoided, data interaction efficiency is improved, and user experience is enhanced.
Referring to fig. 6, a schematic structural diagram of an electronic device according to an embodiment of the present application is provided. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor modules 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a touch sensor 180K, an ambient light sensor 180L, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. For example, the processor 110 is configured to perform the method of detecting ambient light in the embodiments of the present application.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it may be called directly from memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
External memory, accessed through the external memory interface 120, refers in this embodiment to storage other than the internal memory of the electronic device and the cache of the processor, and is generally non-volatile memory.
Internal memory 121, which may also be referred to as "memory," may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ an organic light-emitting diode (OLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 also includes various types of sensors that can convert various physical signals into electrical signals. Illustratively, the pressure sensor 180A is configured to sense a pressure signal, which may be converted to an electrical signal. The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc. The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J. The bone conduction sensor 180M may acquire a vibration signal.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
For example, in the embodiment of the present application, the touch sensor 180K may detect a user's click on the icon of an application program and pass the detected click operation to the application processor, which determines that the click operation is intended to start or run the application program and then runs the application program accordingly.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The above is a specific description of the embodiment of the present application taking the electronic device 100 as an example. It should be understood that the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100. The electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device provided in the embodiments of the present application may be a User Equipment (UE), for example, a mobile terminal (e.g., a user mobile phone), a tablet computer, a desktop, a laptop, a handheld computer, a netbook, a personal digital assistant (personal digital assistant, PDA), and other devices.
In addition, an operating system runs on the above components, for example the iOS operating system developed by Apple Inc., the open-source Android operating system developed by Google Inc., or the Windows operating system developed by Microsoft Corporation. Applications may be installed and run on the operating system.
The operating system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 7 is a software configuration block diagram of the electronic device 100 of the embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer (applications), an application framework layer (application framework), the Android runtime (Android runtime) and system libraries, and a kernel layer (kernel).
The application layer may include a series of application packages. For example, the application layer may include applications (which may be referred to simply as apps) such as camera, gallery, calendar, phone, map, navigation, WLAN, Bluetooth, music, video, short messages, reading, travel, sports health, smart life, and the like, which are not limited in any way by the embodiments of the present application.
Applications in the application layer may be classified into system applications, which may include a desktop, a system user interface (SystemUI), etc., and non-system applications, which may include games, maps, short videos, social applications, shopping applications, reading books, travel, sports health, smart life, etc.
In the embodiment of the present application, the application layer may further include a screen sensing module, a service logic processing module, a service presentation module, and the like. The screen sensing module, the service logic processing module, and the service presentation module may be independent apps, may be integrated in different apps, or may be integrated in the same app, which is not limited in this application.
The screen sensing module is resident or operates in a low-power-consumption mode and has the capability of sensing a user's touch operation on the screen. The screen sensing module may detect related events and acquire the status of events from other applications of the application layer, or from the application framework layer, the system layer, or the kernel layer, through an application programming interface (API). In the embodiment of the present application, the screen sensing module is mainly used to monitor screen touch events (also called touch events); when a screen touch event is detected, it notifies the service logic processing module of the touch event. The screen sensing module may also be used to obtain which application (APP) the touched object belongs to, i.e., the application package name. That is, the screen sensing module may recognize that the screen of a particular application has been touched and generate touch screen data.
The service logic processing module (such as a computing engine) has service logic processing capability and the logic for acquiring and processing touch screen data. For example, the service logic processing module receives a touch screen event triggered on the user's screen and the touch screen data sent by the screen sensing module, and determines whether the touch screen conditions are met, thereby deciding whether to update the screen display according to the touch screen data.
The service presentation module (such as YOYO suggestions) is used to update the screen display of the mobile phone according to the touch screen data.
The application framework layer provides an application programming interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 7, the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, an activity manager, a clipboard manager, and the like, which are not limited in any way by the embodiments of the present application.
The window manager is used to manage window programs, and can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The activity manager is used to manage the life cycle of each application and the navigation back function, and is responsible for creating the main Android thread and maintaining the life cycle of each application.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message alerts, and so on. The notification manager may also present notifications in the form of charts or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications in the form of dialog windows on the screen. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part comprises the functions that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), two-dimensional graphics engine (e.g., SGL), etc. The surface manager is used to manage the display subsystem and provides a fusion of the two-dimensional and three-dimensional layers for the plurality of applications.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
A system architecture according to an embodiment of the present application is described below with reference to fig. 8 based on the hardware architecture of fig. 6 and the software architecture of fig. 7.
Fig. 8 shows an architecture diagram of an electronic device 100 according to an embodiment of the present application. As shown in fig. 8, the electronic device 100 includes a hardware platform and two mutually isolated operating environments running on the hardware platform, that is, a rich execution environment REE and a trusted execution environment TEE, where the two operating environments each have independent hardware resources and an operating system. The REE and the TEE may also be referred to herein as a REE module and a TEE module, respectively. Isolation of the hardware resources of the REE and the TEE can be achieved through a hardware isolation technology, such as the TrustZone mechanism, and isolation between the operating systems corresponding to the REE and the TEE, and between their applications, can be achieved through virtualization technology. In this way, the software and hardware resources that the TEE can access are separated from the REE, and the TEE imposes very strict restrictions on the data and functions that applications can access, so that its security level meets certain security requirements; the TEE can therefore be considered a secure execution environment. The REE is the runtime environment outside the TEE and, as opposed to the TEE, may also be referred to as a non-secure execution environment.
The hardware platform of the electronic device 100 includes, for example, common peripherals and trusted peripherals. The trusted peripherals include a secure element (SE) that can only be controlled and accessed by the TEE, such as secure memory, a secure clock, and a trusted keyboard. A common peripheral is a device that can be controlled and accessed by the operating system in the REE.
The applications running in the TEE are referred to as trusted applications (trusted application, TA), and there may be one or more TAs (only two TAs are shown as an example). The interface of a TA may be referred to as a trusted user interface (TUI). The applications running in the REE are called client applications (client application, CA), and there may be one or more CAs (only two CAs are shown as an example). The interface of a CA may be referred to as a user interface (UI). For example, the CA may specifically be application software such as various payment applications, bank clients, mobile phone shield applications, electronic identity cards, mobile phone POS, or other applications involving the input of sensitive information such as account numbers and passwords; the TA is the security application corresponding to the CA, used for performing the input operations of the sensitive information involved in the CA.
For definitions of the terms REE, TEE, CA, TA, etc. used in all embodiments of the present application, reference may also be made to the TEE-related standards set forth by the GlobalPlatform (GP) organization.
The TA running in the TEE may provide security-related functions or services for the CA in the REE or for other TAs within the TEE. A trusted operating system running in the TEE may provide a TEE internal interface to the TA, through which the TA obtains access rights to secure resources and services, including but not limited to: key injection and management, encryption, secure storage, a secure clock, a trusted user interface (TUI), a trusted keyboard, and the like.
The CA running in the REE may use an external interface provided by the TEE to request the security services provided by the TA in the TEE. The operating system running in the REE (e.g., a terminal operating system such as Windows Phone) may provide richer features than the trusted operating system in the TEE and can host various types of applications, but its security is lower than that of the trusted operating system.
For example, in scenarios such as mobile payment and online bank transfer, if input and display of user-sensitive information is involved, the CA in the REE may invoke the TUI and trusted keyboard services on the TEE side through an external interface provided by the TEE, so as to prevent malicious programs on the REE side from intercepting or stealing the user's sensitive information.
An architecture based on a Linux operating system can also be divided into a user mode and a kernel mode. The kernel is essentially software that controls the computer's hardware resources and provides the environment in which upper-layer applications run. The user mode is the activity space of upper-layer applications; the execution of an application must rely on the resources provided by the kernel, including CPU resources, memory resources, I/O resources, and the like. In order for upper-layer applications to access these resources, the kernel must provide an interface for them, namely the system call.
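As a minimal illustrative sketch (not part of the claimed solution), the following C fragment shows a user-mode process obtaining touch data from the kernel only through system calls, assuming a Linux evdev-style device node; the device path is an assumption chosen purely for illustration.

```c
/* Sketch: user mode reaches the touch hardware only through system calls.
 * The device path and evdev usage are assumptions for illustration. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    /* open() and read() are system calls: the interface the kernel exposes
     * to user mode for accessing the input device it manages. */
    int fd = open("/dev/input/event0", O_RDONLY);   /* illustrative path */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct input_event ev;
    while (read(fd, &ev, sizeof(ev)) == (ssize_t)sizeof(ev)) {
        if (ev.type == EV_ABS && ev.code == ABS_MT_POSITION_X)
            printf("touch X = %d\n", ev.value);
    }
    close(fd);
    return 0;
}
```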
It should be understood that the CA operates in the REE user mode and the TA operates in the TEE user mode. A driver module is deployed in the kernel mode of the REE (e.g., including a driver interface through which the REE accesses the TEE); a driver module is also deployed in the kernel mode of the TEE. The driver modules in both the REE and the TEE may access the corresponding hardware devices; for example, the display of the CA's UI on the display screen may be implemented by invoking the GPU. The driver module of the REE may further include a TUI conversion function or a TUI proxy function. In addition, a REE control module can be deployed in the REE and a TEE control module can be deployed in the TEE, and the CA can access the TA through the REE control module and the TEE control module to realize the corresponding security operations. For example, the REE control module may, according to the TUI access request (or TUI display request) of the CA, call the driver module on the REE side to drive the hardware device to exit the non-secure working mode (referred to as the non-TUI mode); after the hardware device exits the non-TUI mode, the TEE control module can, according to the message sent by the REE control module, call the driver module on the TEE side to drive the hardware device to switch into the TUI mode, thereby achieving hardware isolation from the REE. The corresponding TA can then be invoked, so that the CA can access the TA and the TUI of the TA can be displayed on the display screen for operations such as signing and confirmation. The specific functions of the REE driver module, the TEE driver module, the REE control module, the TEE control module, and the like can be realized by a processor in the electronic device.
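The mode-switch sequence described above may be sketched, purely for illustration, as follows; every function name in this sketch is hypothetical and merely stands in for the implementation-specific REE/TEE control-module and driver interfaces.

```c
#include <stdio.h>

/* Hypothetical sketch of the TUI mode-switch sequence described above. */
typedef enum { MODE_NON_TUI, MODE_TUI } display_mode_t;

/* Stubs standing in for the real driver calls and the REE->TEE message. */
static void ree_driver_exit_non_tui_mode(void) { printf("REE driver: leave non-TUI mode\n"); }
static void tee_driver_enter_tui_mode(void)    { printf("TEE driver: enter TUI mode\n");    }
static void invoke_trusted_application(void)   { printf("TEE: invoke TA, display its TUI\n"); }

/* TEE control module: completes the switch after the REE releases the hardware. */
static void tee_control_on_message(display_mode_t requested)
{
    if (requested == MODE_TUI) {
        tee_driver_enter_tui_mode();    /* hardware is now isolated from the REE */
        invoke_trusted_application();   /* the TA can draw its TUI and take trusted input */
    }
}

/* REE control module: handles a TUI display request coming from a CA. */
static void ree_control_handle_tui_request(void)
{
    ree_driver_exit_non_tui_mode();     /* step 1: hardware exits the non-TUI mode   */
    tee_control_on_message(MODE_TUI);   /* step 2: message to the TEE control module */
}

int main(void)
{
    ree_control_handle_tui_request();
    return 0;
}
```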
In the embodiment of the application, in order to safely interact with a user, safely present information to the user and receive user input through a trusted interface, the TUI and related interfaces are implemented in the TEE, and through the scheme provided by the application, the Trusted User Interface (TUI) of the TA corresponding to the CA can quickly acquire touch screen data.
Although the Android system is taken as an example for explanation, the basic principle of the embodiment of the present application is equally applicable to electronic devices based on iOS, windows, and other operating systems.
The execution body of the touch screen data processing method based on a trusted execution environment provided in the embodiments of the present application may be the electronic device described above, or a functional module and/or functional entity in the electronic device capable of implementing the method. The method may be implemented by means of hardware and/or software, and may be determined according to actual use requirements, which is not limited here. The touch screen data processing method based on a trusted execution environment provided in the embodiments of the present application is described below by taking an electronic device as an example with reference to the accompanying drawings.
The touch screen data processing method based on the trusted execution environment provided by the embodiment of the application is described below with reference to specific embodiments.
Fig. 9 is a flowchart of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application. Referring to fig. 9, the method includes steps B1 to B2 described below.
In step B1, after the system is started, TVM2 loads the touch screen service and the touch screen driver.
It should be noted that the listening service and the touch screen service are located at the application layer of the software architecture, and the touch screen driver is located at the kernel layer of the software architecture. The touch screen service interacts with the touch screen driver through standard Linux system calls.
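A minimal sketch of this interaction is given below, assuming a character-device node and hypothetical ioctl request codes; the actual device path and request codes are not specified in this application, and only the open()/ioctl()/close() pattern is the point.

```c
/* Sketch: the touch screen service drives the TUI touch screen driver through
 * standard Linux system calls. Device node and request codes are hypothetical. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <linux/ioctl.h>
#include <unistd.h>

#define TUI_TOUCH_IOC_ENTER _IO('x', 1)   /* hypothetical request code */
#define TUI_TOUCH_IOC_EXIT  _IO('x', 2)   /* hypothetical request code */

int main(void)
{
    int fd = open("/dev/tui_touch", O_RDWR);   /* hypothetical device node */
    if (fd < 0) {
        perror("open");
        return 1;
    }
    if (ioctl(fd, TUI_TOUCH_IOC_ENTER) < 0)    /* trigger the driver to enter TUI mode */
        perror("ioctl enter");
    /* ... touch screen data is produced while the TUI is displayed ... */
    if (ioctl(fd, TUI_TOUCH_IOC_EXIT) < 0)     /* trigger the driver to exit TUI mode */
        perror("ioctl exit");
    close(fd);
    return 0;
}
```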
In step B2, when the touch screen driver detects a user touch screen operation, the TVM2 actively transmits touch screen data collected by the touch screen driver to the TVM1.
Thus, once the TVM2 side detects a valid user touch screen operation, TVM2 actively submits touch screen data to TVM1 without requiring TVM1 to request touch screen data from TVM2 multiple times.
An interaction schematic diagram of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application is described below with reference to fig. 10. Referring to fig. 10, software modules such as a listening service, a touch screen service, and a touch screen driver are included in the TVM 2. The method comprises the following steps 11-14.
In step 11, TVM2 loads the touch screen service after the system is started.
The listening service serves as the entry through which TVM1 accesses TVM2. When the listening service detects that the system has started, it notifies TVM2 to load the touch screen service.
Step 12: the touch screen service loads the touch screen driver.
Step 13: when the touch screen driver of the TVM2 detects that a touch screen operation exists, the touch screen driver collects touch screen data.
The touch screen data includes touch screen position information (X, Y) and event information such as UP or DOWN, where DOWN indicates the start of a gesture event and UP indicates the end of the gesture event.
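One possible in-memory layout for such touch screen data is sketched below; the field names and widths are assumptions, and only the (X, Y) position and the UP/DOWN event type are taken from the description above.

```c
/* Sketch of one possible layout for the touch screen data described above. */
#include <stdint.h>

enum touch_event_type {
    TOUCH_DOWN = 0,   /* start of the gesture event */
    TOUCH_UP   = 1    /* end of the gesture event   */
};

struct touch_event {
    uint16_t x;                   /* touch screen position, X coordinate */
    uint16_t y;                   /* touch screen position, Y coordinate */
    enum touch_event_type type;   /* DOWN = gesture begins, UP = gesture ends */
};
```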
Wherein the touch screen driver transmits touch screen data to the message channel.
Step 14: TVM2 sends touch screen data to TVM1 through the message channel.
The message channel may be understood as a data transmission manner between different TVMs. For example, a message channel may be a channel that communicates inter-processes between multiple trusted virtual machines.
It should be noted that, from the perspective of application services, an underlying socket message is used between TVM2 and TVM1 for inter-process communication. In practical implementation, the way TVM2 and TVM1 use an underlying socket for transmission differs from transmission over a conventional TCP/IP network socket; that is, the communication mechanism between the different modules differs from the implementation mechanism of a standard socket layer.
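Purely to illustrate the socket-style message exchange on the TVM2 side, the following sketch uses a Unix domain socket and an invented path; as noted above, the real inter-VM channel is not a conventional TCP/IP (or local) socket, so this transport is an assumption made only for readability.

```c
/* Illustrative TVM2-side sender: pushes one touch event to TVM1 through a
 * socket-style message channel. Socket type and path are stand-ins only. */
#include <stdint.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>

struct touch_event { uint16_t x, y; uint8_t is_down; };

static int send_touch_event(const struct touch_event *ev)
{
    int fd = socket(AF_UNIX, SOCK_STREAM, 0);
    if (fd < 0)
        return -1;

    struct sockaddr_un addr = { .sun_family = AF_UNIX };
    strncpy(addr.sun_path, "/tmp/tui_channel", sizeof(addr.sun_path) - 1);

    /* connect to the peer (TVM1 side) and write the event as one message */
    if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
        write(fd, ev, sizeof(*ev)) != (ssize_t)sizeof(*ev)) {
        close(fd);
        return -1;
    }
    close(fd);
    return 0;
}

int main(void)
{
    struct touch_event ev = { .x = 120, .y = 480, .is_down = 1 };
    return send_touch_event(&ev) == 0 ? 0 : 1;
}
```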
The following analysis illustrates the differences between the schemes of the present application and the conventional schemes.
In the conventional scheme as shown in fig. 4 and 5, the conventional scheme includes a request flow of touch screen data and a transmission flow of touch screen data. The request flow of the touch screen data is as follows: TVM1→listening service of TVM2→touch screen service→touch screen drive. Correspondingly, the transmission flow of the touch screen data is as follows: touch screen driving of TVM2→touch screen service→listening service→message channel→tvm1.
In comparison, the scheme of the application only comprises a transmission flow of touch screen data, and does not comprise a request flow of touch screen data. The transmission flow of the touch screen data is as follows: touch screen driving of TVM 2- & gt message channel- & gt TVM1, wherein the transmission flow does not need to be forwarded by touch screen service and monitoring service, and the transmission flow is simplified.
In conventional schemes, the touch screen service is responsible for triggering the driver to enter and exit TUI mode, and is also responsible for forwarding touch screen data. In the scheme of the present application, the touch screen service is still responsible for triggering the driver to enter and exit TUI mode, but is no longer responsible for forwarding touch screen data.
It can be seen that, in the present application, TVM1 does not need to actively send requests to TVM2 to query for touch screen data. TVM2 plays the active role: once TVM2 detects touch screen data, it actively sends the data to TVM1.
Next, with reference to fig. 11, a timing chart of a touch screen data processing method based on a trusted execution environment according to an embodiment of the present application will be described in detail. The TVM2 includes software modules such as a listening service, a touch screen service, a TUI touch screen driver, and a TUI display driver. As shown in fig. 11, S101 to S118 are included in the timing chart.
S101, after the system is started, TVM2 loads touch screen service.
When the listening service detects that the system has started, it notifies TVM2 to load the touch screen service.
S102, the touch screen service loads the touch screen driver.
After the touch screen service loads the touch screen driver, the touch screen service is responsible for triggering the touch screen driver to enter or exit the TUI mode.
S103, the TVM1 notifies the system that the TUI mode has been switched.
S104, the TVM2 side monitoring service monitors that the system is switched to the TUI mode.
S105, the TVM2 side monitoring service informs the touch screen service that the system is currently in the TUI mode.
S106, the touch screen service triggers the TUI touch screen driver to enable the TUI mode.
S107, the TUI touch screen driver enables the TUI mode and continuously monitors whether touch screen data exists.
S108, the TUI touch screen driver monitors touch screen data.
S109, the TVM2 side TUI touch screen driver sends touch screen data to the TVM1 through a message channel.
S110, the TVM1 receives the touch screen data and generates a graphical interface according to the touch screen data.
S111, TVM1 sends the graphical interface to TVM2 through the message channel.
And S112, the TVM2 side monitoring service receives the graphical interface and forwards the graphical interface to the TUI display driver.
S113, the TVM2 side TUI display driver displays the graphical interface.
Therefore, when the TUI touch screen driver has enabled the TUI mode, it continuously monitors whether touch screen data exists; once touch screen data appears, TVM2 actively transmits it to TVM1. In this way, valid user touch screen operations are detected in time, touch screen data is not lost, and user experience is improved.
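The TVM2-side behavior summarized above can be sketched as the following loop; poll_touch_event() and send_to_tvm1() are hypothetical helpers standing in for the driver read path and the message channel, respectively.

```c
/* Sketch of the TVM2-side behaviour: while TUI mode is enabled, keep
 * monitoring for touch screen data and push each event to TVM1 at once. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

struct touch_event { uint16_t x, y; bool is_down; };

/* Stub: pretend a single DOWN event arrives, then nothing more. */
static bool poll_touch_event(struct touch_event *ev)
{
    static int delivered = 0;
    if (delivered++) return false;
    *ev = (struct touch_event){ .x = 200, .y = 640, .is_down = true };
    return true;
}

/* Stub: in the real system this writes to the TVM2 -> TVM1 message channel. */
static void send_to_tvm1(const struct touch_event *ev)
{
    printf("push to TVM1: (%u, %u) %s\n",
           (unsigned)ev->x, (unsigned)ev->y, ev->is_down ? "DOWN" : "UP");
}

int main(void)
{
    bool tui_mode_enabled = true;          /* set when the system switches to TUI mode */
    struct touch_event ev;

    while (tui_mode_enabled) {             /* S107: keep monitoring while in TUI mode  */
        if (poll_touch_event(&ev))         /* S108: touch screen data detected         */
            send_to_tvm1(&ev);             /* S109: TVM2 actively pushes it to TVM1    */
        else
            tui_mode_enabled = false;      /* stand-in for the exit path, to terminate */
    }
    return 0;
}
```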
Steps S114 to S118 below describe how the TUI mode is exited.
S114, TVM1 notifies the system that the TUI mode has been exited.
S115, the TVM2 side listening service listens that the system has exited the TUI mode.
S116, the TVM2 side monitoring service informs the touch screen service that the system has exited the TUI mode.
S117, the touch screen service triggers the TUI touch screen driver to exit the TUI mode.
S118, the TUI touch screen driver exits the TUI mode and no longer monitors whether touch screen data exists.
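A compact sketch of the notification dispatch in S103 to S106 and S114 to S118 is given below; all function names are hypothetical, and the real listening-service and touch-screen-service interfaces may differ.

```c
/* Sketch of the TVM2-side dispatch for TUI mode enter/exit notifications. */
#include <stdbool.h>
#include <stdio.h>

static void touch_driver_enable_tui(void)  { printf("TUI touch driver: TUI mode on\n");  }
static void touch_driver_disable_tui(void) { printf("TUI touch driver: TUI mode off\n"); }

/* Touch screen service: the only component that triggers the driver. */
static void touch_service_on_mode_change(bool tui_mode)
{
    if (tui_mode)
        touch_driver_enable_tui();    /* S106: enter TUI mode, start monitoring      */
    else
        touch_driver_disable_tui();   /* S117/S118: exit TUI mode, stop monitoring   */
}

/* Listening service: receives the notification and forwards it (S105 / S116). */
static void listening_service_on_notification(bool tui_mode)
{
    touch_service_on_mode_change(tui_mode);
}

int main(void)
{
    listening_service_on_notification(true);   /* system switched to TUI mode (S103/S104) */
    listening_service_on_notification(false);  /* system exited TUI mode (S114/S115)      */
    return 0;
}
```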
In the embodiment of the application, TVM1 is responsible for completing the generation of the graphical interface according to the APP specification information, and TVM2 is responsible for displaying the graphical interface generated by TVM1. The touch screen driver of the TVM2 collects touch screen data and sends the touch screen data to the TVM1 through a message channel. The touch screen data may include touch screen position information (X, Y) and event information such as UP/DOWN. Where DOWN indicates a gesture event beginning and UP indicates an end.
After the TVM1 receives the touch screen data of the TVM2, the TVM1 further responds according to the touch screen position information (X, Y), UP/DOWN and other event information, for example, determines the position of the keyboard clicked by the user according to the touch screen position information (X, Y), and determines the content to be displayed in the input box of the TUI according to the determination result. Wherein the content to be displayed may be a combination of letters, numbers and/or characters. The content to be displayed may be any of the following: user name, account number and password, bank account number.
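For illustration only, the following sketch shows one way TVM1 might map a received touch position onto a key of the TUI keyboard and append that key to the input-box content on an UP event; the keyboard geometry and key map are invented assumptions.

```c
/* Sketch: map an (X, Y) touch position onto a key of a hypothetical TUI
 * keyboard layout and append it to the content of the TUI input box. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define KEY_W 100          /* hypothetical key width in pixels   */
#define KEY_H 100          /* hypothetical key height in pixels  */
#define KEYBOARD_TOP 800   /* hypothetical top edge of the pad   */

static const char keymap[3][3] = {      /* hypothetical 3x3 numeric pad */
    { '1', '2', '3' },
    { '4', '5', '6' },
    { '7', '8', '9' },
};

/* Return the character under (x, y), or 0 if the touch is outside the pad. */
static char key_at(uint16_t x, uint16_t y)
{
    if (y < KEYBOARD_TOP) return 0;
    unsigned col = x / KEY_W;
    unsigned row = (y - KEYBOARD_TOP) / KEY_H;
    return (row < 3 && col < 3) ? keymap[row][col] : 0;
}

int main(void)
{
    char input_box[32] = "";            /* content to display in the TUI input box */

    /* Simulated touch data received from TVM2: an UP event at (150, 920). */
    char c = key_at(150, 920);
    if (c) {
        size_t len = strlen(input_box);
        if (len + 1 < sizeof(input_box)) {
            input_box[len] = c;
            input_box[len + 1] = '\0';
        }
    }
    printf("TUI input box now shows: %s\n", input_box);
    return 0;
}
```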
In actual implementation, the loading of the touch screen service and the touch screen driver is finished in TVM2 of the second TEE in advance. Once the screen display content of the electronic device is switched to the TUI interface (enters the TUI mode), the touch screen driver detects whether a valid user touch screen operation exists in real time, and when the touch screen driver detects the valid user touch screen operation, the touch screen driver acquires touch screen data and the TVM2 actively transmits the touch screen data to the TVM1.
Wherein, TVM1 of the first TEE only needs to monitor touch screen data from TVM2 of the second TEE, and does not need to actively poll TVM2 of the second TEE. The embodiment of the application can optimize the transmission of the touch screen data stream by improving the current software implementation and workflow, and solves the problems faced by the conventional scheme.
Specifically, the scheme can simplify the interaction flow of touch screen data between two TVMs, and the transmission of touch screen data can be completed through one active notification. In actual implementation, the TVM1 of the first TEE does not need to interact with the TVM2 of the second TEE for multiple times, so that the implementation flow is simplified.
On the one hand, the method and device can avoid a large number of active queries by TVM1. Because some queries may be invalid when TVM1 queries at a high frequency, avoiding a large number of active queries by TVM1 avoids unnecessary interactions. In actual implementation, TVM1 of the first TEE does not need to spend CPU time on multiple queries, so invalid queries are avoided and energy consumption is saved.
On the other hand, the method and device avoid the problem that touch screen data may be lost when TVM1 queries at a low frequency, thereby optimizing user experience. In actual implementation, valid user touch screen operations are not lost, which improves user experience.
Therefore, through the improved scheme, the service flow can be simplified, invalid inquiry is avoided, energy consumption is saved, the touch screen operation of a user can be prevented from being lost, and user experience is improved.
It should be noted that the improved scheme of the present application is applicable both to scenarios in which the TEE runs directly in a TrustZone environment and to TVM environments based on a hypervisor.
According to the scheme, in a scenario where multiple trusted execution environments TEE (such as the two trusted virtual machines TVM1 and TVM2 described above) cooperatively provide a trusted user interface TUI, TVM2 loads the touch screen service and the touch screen driver in advance; once the touch screen driver detects that the user has input touch screen data in the TUI, TVM2 immediately sends the touch screen data to TVM1, and TVM1 does not need to periodically poll TVM2 for touch screen data. With the improved scheme of the present application, the service flow can be simplified, touch screen data can be acquired quickly, valid user touch screen operations are not lost, a large number of active queries and unnecessary interactions by TVM1 are avoided, data interaction efficiency is improved, and user experience is enhanced.
It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. The steps of an algorithm for each example described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
The present embodiment may divide the electronic device into functional modules according to the above method examples; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is merely a logical function division; other division manners may be used in actual implementation.
Where functional modules are divided corresponding to functions, the electronic device may also be divided into a display unit, a detection unit, a processing unit, and the like. It should be noted that all relevant content of the steps involved in the above method embodiments can be referred to the functional descriptions of the corresponding functional modules, and details are not repeated here.
The electronic device provided in this embodiment is configured to execute the touch screen data processing method based on the trusted execution environment, so that the same effect as that of the implementation method can be achieved.
In case an integrated unit is employed, the electronic device may comprise a processing module, a storage module and a communication module. The processing module can be used for controlling and managing the actions of the electronic equipment; the storage module may be used to support the electronic device to execute stored program code, data, etc.; and the communication module can be used for supporting the communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 6.
The present application also provides a chip coupled to a memory for reading and executing a computer program or instructions stored in the memory to perform the methods of the embodiments described above.
The present application also provides an electronic device comprising a chip for reading and executing a computer program or instructions stored in a memory, such that the methods in the embodiments are performed.
The present embodiment further provides a computer readable storage medium, where computer instructions are stored, which when executed on an electronic device, cause the electronic device to perform the above-mentioned related method steps to implement the touch screen data processing method based on the trusted execution environment in the above-mentioned embodiment.
The present embodiment also provides a computer program product. When the computer program product runs on a computer, the computer is caused to execute the above related steps, so as to implement the touch screen data processing method based on a trusted execution environment in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip executes the touch screen data processing method based on the trusted execution environment in the method embodiments.
The electronic device, the computer readable storage medium, the computer program product or the chip provided in this embodiment are used to execute the corresponding method provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding method provided above, and will not be described herein.
The embodiment of the present application is not particularly limited to the specific structure of the execution body of the method provided in the embodiment of the present application, as long as touch screen data processing can be performed in the method provided in the embodiment of the present application by executing a program in which codes of the method provided in the embodiment of the present application are recorded. For example, the execution body of the method provided in the embodiment of the present application may be an electronic device, or a functional module in the electronic device that can call a program and execute the program.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Furthermore, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, or the part contributing to the prior art, or the part of the technical solution, may be embodied in the form of a computer software product stored in a storage medium, the computer software product comprising several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present application. The foregoing storage medium may include, but is not limited to: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A display method for a trusted user interface, applied to an electronic device, wherein the electronic device comprises a main virtual machine TVM, a first trusted virtual machine TVM and a second trusted virtual machine TVM, the main virtual machine TVM is applied to a rich execution environment REE, and the first trusted virtual machine TVM and the second trusted virtual machine TVM are applied to a trusted execution environment TEE, characterized in that the method comprises the following steps:
when an operating system of the electronic equipment is started, loading a touch screen driver for a Trusted User Interface (TUI) by the TVM, and monitoring whether touch screen data exist in the TUI by the touch screen driver;
the second trusted virtual machine TVM displays a first user interface, wherein the first user interface is a trusted user interface;
when the touch screen driver monitors that a user inputs touch screen data in the trusted user interface TUI, the second trusted virtual machine TVM sends the touch screen data to the first trusted virtual machine TVM;
the second trusted virtual machine TVM receives a second user interface sent by the first trusted virtual machine TVM, wherein the second user interface is a trusted user interface generated by the first trusted virtual machine TVM according to the touch screen data;
The second trusted virtual machine TVM is updated from displaying the first user interface to displaying the second user interface.
2. The method according to claim 1, wherein the method further comprises:
when the operating system of the electronic equipment is switched to a TUI mode, triggering the touch screen driver to start the TUI mode by the TVM, wherein the TUI mode is a running mode corresponding to the electronic equipment in the TEE environment;
after the TUI mode is enabled by the touch screen driver, the second trusted virtual machine TVM continuously monitors whether touch screen data exists or not through the touch screen driver.
3. The method according to claim 1, wherein the method further comprises:
when the operating system of the electronic equipment exits the TUI mode, the second trusted virtual machine TVM triggers the touch screen driver to exit the TUI mode;
and after exiting the TUI mode, the touch screen driver stops monitoring whether touch screen data exists or not.
4. A method according to any one of claims 1 to 3, further comprising:
the second trusted virtual machine TVM receives touch screen operation of a user in the first user interface;
in response to the touch screen operation of the user, the second trusted virtual machine TVM calls the touch screen driver to collect the touch screen data.
5. The method of claim 4, wherein after the receiving user's touch screen operation in the first user interface, the method further comprises:
the second trusted virtual machine TVM determines that the touch screen operation meets a preset touch screen condition;
the preset touch screen condition is used for judging whether the touch screen operation is effective touch screen operation or not;
the touch screen operation comprises input of a preset area in the first user interface, wherein the preset area is an area for inputting user privacy information.
6. The method of any of claims 1 to 5, wherein the second trusted virtual machine TVM updates the display from displaying the first user interface to the second user interface, comprising:
and the second trusted virtual machine TVM calls a TUI display driver, updates and displays the first user interface to the second user interface, and the TUI display driver is a driver for triggering and displaying the trusted user interface.
7. The method according to any of the claims 1 to 6, characterized in that the host virtual machine TVM is running a client application CA and is pre-set with an operating system kernel and a first application programming interface API, the first API being an interface function between the host virtual machine and the first trusted virtual machine;
the first trusted virtual machine TVM runs a trusted application TA, and is pre-provided with a second API, a TUI framework and a trusted execution environment kernel, wherein the second API is an interface function for calling the TUI framework;
the second trusted virtual machine TVM is preset to provide a Trusted User Interface (TUI) service for the Trusted Application (TA), wherein the TUI service comprises a TUI display service and a TUI touch screen service, the TUI display service is associated with a TUI display driver, and the TUI touch screen service is associated with a TUI touch screen driver;
wherein, before the second trusted virtual machine TVM displays the first user interface, the method further comprises: the client application CA receives an operation of a user on the client application CA; the client application CA calls the first API and initiates a TUI display request to the trusted application TA through the operating system kernel; in response to the TUI display request, the trusted application TA acquires a first user interface corresponding to the client application CA; and the trusted application TA sends the TUI display request and the first user interface to the TUI service;
wherein the second trusted virtual machine TVM displaying a first user interface comprises: in response to the TUI display request sent by the trusted application TA, the TUI service calls a TUI display driver to display the first user interface.
8. The method of claim 7, wherein the trusted application TA sending the TUI display request and the first user interface to the TUI service comprises:
the trusted application TA calls the second API, enters a TUI framework, and then sends the TUI display request and the first user interface to the TUI service through the trusted execution environment kernel and inter-process communication IPC.
9. The method according to claim 7 or 8, wherein after the client application CA receives a user operation on the client application CA, before the client application CA invokes the first API to initiate a TUI display request to the trusted application TA, the method further comprises:
responding to the operation of a user on the client application CA, judging whether a to-be-displayed interface of the client application CA contains a user privacy information input area or not;
triggering the electronic equipment to switch from a non-TUI mode to a TUI mode when a user privacy information input area is contained in a to-be-displayed interface of the client application CA;
The non-TUI mode is a running mode corresponding to the electronic equipment in the REE environment.
10. The method of claim 9, wherein the client application CA invoking the first API to initiate a TUI display request to the trusted application TA comprises:
when the operating system of the electronic device is switched to the TUI mode, the client application CA calls the first API to initiate the TUI display request to the trusted application TA.
11. The method according to any one of claims 1 to 10, wherein the second trusted virtual machine TVM sending the touch screen data to the first trusted virtual machine TVM comprises:
the second trusted virtual machine TVM sends the touch screen data to the first trusted virtual machine TVM through a first message channel;
wherein the first message channel is a manner of implementing data transmission through a socket.
12. The method according to any one of claims 1 to 11, further comprising:
when the operating system of the electronic device is started, the second trusted virtual machine TVM loads a TUI display driver.
13. The method according to any one of claims 1 to 12, wherein the touch screen data includes location information for indicating a location of a touch screen operation and event information for indicating an event type of the touch screen.
14. An electronic device comprising a processor coupled to a memory, the processor for executing a computer program or instructions stored in the memory to cause the electronic device to implement the method of any one of claims 1-13.
15. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1 to 13.
