CN116880941A - Method, computer system, and storage medium for cross-application view embedding

Method, computer system, and storage medium for cross-application view embedding

Info

Publication number
CN116880941A
Authority
CN
China
Prior art keywords
application
control
screen
display object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310807468.0A
Other languages
Chinese (zh)
Inventor
Lei Jinliang (雷金亮)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Weilai Automobile Technology Anhui Co Ltd
Original Assignee
Weilai Automobile Technology Anhui Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Weilai Automobile Technology Anhui Co Ltd filed Critical Weilai Automobile Technology Anhui Co Ltd
Priority to CN202310807468.0A
Publication of CN116880941A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • G06F 9/445 - Program loading or initiating
    • G06F 9/44505 - Configuring for program initiating, e.g. using registry, configuration files
    • G06F 9/44521 - Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the field of cross-application view embedding, and more particularly to a method for cross-application view embedding, a computer system implementing the method, and a computer storage medium implementing the method. The method comprises the following steps: in response to launch of a host application, passing a surface object created by the host application to an embedded application, wherein the embedded application implements user interface management based on a fragment control; and creating a virtual display object and a dual-display object in the embedded application, and drawing a view of the embedded application onto the surface object through the virtual display object and the dual-display object, so as to display the view of the embedded application in the host application, wherein the creation of the dual-display object comprises: taking the dual-display object as a host of the fragment control, and loading and managing the fragment control by means of the dual-display object.

Description

Method, computer system, and storage medium for cross-application view embedding
Technical Field
The present application relates to the field of cross-application view embedding, and more particularly to a method for cross-application view embedding, a computer system implementing the method, and a computer storage medium implementing the method.
Background
Currently, the desktop framework architecture of an in-vehicle system often requires a cross-application view embedding scheme in order to embed the views of other applications (e.g., a map application) into the view of a desktop application. However, in existing cross-application view embedding schemes, because the host is not an Activity component, usually only View controls of the embedded application can be loaded for display; standard Android Fragment controls cannot be loaded, and widgets that depend on an Activity component, such as Dialog controls and PopupWindow controls, cannot be popped up.
Disclosure of Invention
To solve or at least alleviate one or more of the above problems, the following solutions are provided. Embodiments of the present application provide a method for cross-application view embedding, a computer system implementing the method, and a computer storage medium implementing the method, which enable standard fragment controls to be loaded when a view is embedded, thereby enabling more dynamic and flexible user interface management.
According to a first aspect of the present application, there is provided a method for cross-application view embedding, the method comprising the steps of: in response to launch of a host application, passing a surface object created by the host application to an embedded application, wherein the embedded application implements user interface management based on a fragment control; and creating a virtual display object and a dual-display object in the embedded application, and drawing a view of the embedded application onto the surface object through the virtual display object and the dual-display object, so as to display the view of the embedded application in the host application, wherein the creation of the dual-display object comprises: taking the dual-display object as a host of the fragment control, and loading and managing the fragment control by means of the dual-display object.
Alternatively or additionally to the above, a method according to an embodiment of the application further comprises: adding a texture view control in the host application and setting parameters of the texture view control, wherein the texture view control designates a view display area of the embedded application, and the surface object is embedded in the texture view control.
Alternatively or additionally to the above, a method according to an embodiment of the application further comprises: in response to launch of the host application, sending parameters of the texture view control to the embedded application, wherein the parameters include a width, a height, and a resolution of the view display area of the embedded application in the host application.
Alternatively or additionally to the above, in a method according to an embodiment of the present application, sending the parameters of the texture view control to the embedded application and passing the surface object created by the host application to the embedded application comprise: in response to a call request of the host application, binding, by the embedded application, a service of the host application; and after successful binding, sending, by the host application, the surface object and the parameters of the texture view control to the embedded application.
Alternatively or additionally to the above, in a method according to an embodiment of the present application, creating the virtual display object and the dual-display object in the embedded application comprises: creating the virtual display object based on the surface object and the parameters of the texture view control, and creating, by means of the virtual display object, a dual-display object associated with the virtual display object; and drawing the view of the embedded application onto the surface object through the virtual display object and the dual-display object comprises: rendering the view of the embedded application into the virtual display object by means of the dual-display object, and drawing, by the virtual display object, the rendered content onto the surface object provided by the host application.
Alternatively or additionally to the above, in a method according to an embodiment of the present application, taking the dual-display object as the host of the fragment control and loading and managing the fragment control by means of the dual-display object comprises: creating a fragment controller object and, during the creation, designating the dual-display object as the host of the fragment control; and making the lifecycle state of the fragment control consistent with the lifecycle state of the dual-display object by implementing corresponding lifecycle functions.
Alternatively or additionally to the above, in a method according to an embodiment of the present application, making the lifecycle state of the fragment control consistent with the lifecycle state of the dual-display object by implementing corresponding lifecycle functions comprises: when the dual-display object is initialized, correspondingly performing initialization of the fragment control; when the dual-display object changes from invisible to visible, correspondingly changing the fragment control from invisible to visible; when the dual-display object becomes completely invisible, correspondingly making the fragment control completely invisible; and when the dual-display object is destroyed, correspondingly performing destruction of the fragment control.
Alternatively or additionally to the above, a method according to an embodiment of the application further comprises: passing a user interaction event received by the texture view control to the embedded application across processes.
Alternatively or additionally to the above, in a method according to an embodiment of the present application, passing the user interaction event received by the texture view control to the embedded application across processes comprises: in response to the user interaction event being received by the texture view control, passing a type and coordinates of the user interaction event to the dual-display object of the embedded application.
According to a second aspect of the present application, there is provided a computer system comprising: a memory; a processor; and a computer program stored on the memory and executable on the processor, the execution of the computer program causing any one of the methods according to the first aspect of the application to be performed.
According to a third aspect of the present application there is provided a computer storage medium comprising instructions which, when executed, perform any of the methods according to the first aspect of the present application.
According to the cross-application view embedding solution provided by one or more embodiments of the present application, the view of an embedded application can be displayed on the interface of a host application while a dual-display object is used in place of an Activity component to load and manage the fragment controls of the embedded application. This avoids the problem that, because the host is not an Activity component, fragment controls cannot be loaded and windows such as Dialog controls and PopupWindow controls cannot be popped up, thereby enabling more dynamic and flexible user interface management.
Drawings
The foregoing and/or other aspects and advantages of the present application will become more apparent and more readily appreciated from the following description of the various aspects taken in conjunction with the accompanying drawings in which like or similar elements are designated with the same reference numerals. In the drawings:
FIG. 1 is a schematic flow diagram of a method 10 for cross-application view embedding in accordance with one or more embodiments of the application; and
FIG. 2 is a schematic block diagram of a computer system 20 in accordance with one or more embodiments of the application.
Detailed Description
The following description of the specific embodiments is merely exemplary in nature and is in no way intended to limit the disclosed technology or the application and uses of the disclosed technology. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, or the following detailed description.
In the following detailed description of embodiments, numerous specific details are set forth in order to provide a more thorough understanding of the disclosed technology. It will be apparent, however, to one skilled in the art that the disclosed techniques may be practiced without these specific details. In other instances, well-known features have not been described in detail so as not to unnecessarily complicate the description.
Terms such as "comprising" and "including" mean that, in addition to the elements and steps directly and explicitly recited in the description, the claimed subject matter does not exclude the presence of other elements and steps that are not directly or explicitly recited. Terms such as "first" and "second" do not denote an order of units in terms of time, space, or size, but are merely used to distinguish one unit from another.
Android introduced the concept of the Fragment control in version 3.0, primarily for use on large-screen devices (e.g., tablet computers) to support more dynamic and flexible user interface (UI) design. A large screen has more room to hold more UI components, and more interactions can occur between these components. A Fragment control represents a behavior or a portion of the UI within an Activity component; it can be regarded as a UI fragment embedded in the Activity component, so that the space of a large screen is used more reasonably and fully. A Fragment control may contain a layout, has its own lifecycle, can receive input events belonging to it, and can be added and removed while the Activity component is running. It should be noted that, in general, the host of a Fragment control is an Activity component, and other host types cannot directly load standard Fragment controls. In order to meet the requirements of the desktop framework of the system Launcher (i.e., the desktop UI of the Android system), the present application creatively uses a dual-display Presentation object in place of the Activity component to load and manage the Fragment controls of the embedded application. Hereinafter, various exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings.
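By way of non-limiting illustration, a minimal sketch of such a standard Fragment control is given below; the class name and layout resource are merely exemplary assumptions and do not originate from the present disclosure.

```java
// Minimal sketch of a standard Fragment control: it inflates its own layout, has its own
// lifecycle callbacks, and can be added or removed at runtime by whichever host loads it.
// The class name and layout resource (MapPageFragment, R.layout.fragment_map_page) are
// illustrative assumptions.
import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.fragment.app.Fragment;

public class MapPageFragment extends Fragment {

    @Nullable
    @Override
    public View onCreateView(@NonNull LayoutInflater inflater,
                             @Nullable ViewGroup container,
                             @Nullable Bundle savedInstanceState) {
        // The Fragment contributes its own layout to whatever host loads it
        // (normally an Activity component; in the present application, a Presentation object).
        return inflater.inflate(R.layout.fragment_map_page, container, false);
    }

    @Override
    public void onStart() {
        super.onStart();
        // Lifecycle callback: the Fragment becomes visible together with its host.
    }
}
```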
FIG. 1 is a schematic flow diagram of a method 10 for cross-application view embedding in accordance with one or more embodiments of the application. The method 10 may be used to display the view of an embedded application of a system (e.g., an Android system) in a designated area of a host application.
It should be noted that the host application described herein refers to an application that provides an embedding container, and the embedded application refers to an application that provides embedded content and displays that content within the host application. Illustratively, the host application may be a desktop application, and the embedded application may be a map application, a navigation application, or the like. It should also be noted that the embedded application in the present application implements user interface management based on Fragment controls, i.e., its user interface is organized as Fragment controls rather than being managed directly by an Activity component.
As shown in FIG. 1, in step S110, in response to launch of the host application, a Surface object created by the host application is passed to the embedded application.
A Surface object is a key object used for drawing graphics in Android; it provides an off-screen buffer that can be used to draw graphics onto the screen or to play video. Illustratively, a Surface object may be presented explicitly on the system screen through a SurfaceView control or a TextureView control.
Optionally, before step S110 is performed, a TextureView control may be added to the host application and parameters of the TextureView control may be set. The TextureView control is a view control that supports texture rendering; it can be used for OpenGL ES scene rendering and also for multimedia uses such as video playback. Compared with a SurfaceView control, the TextureView control is more flexible: it can be freely positioned, scaled, and rotated within the layout, and it does not block the UI thread. The TextureView control added in the host application specifies the view display area of the embedded application, and the Surface object passed in step S110 is embedded in the TextureView control. In one embodiment, the host application adds the TextureView control in the host application's display area on the system screen and sets the size and placement of the control; this control represents the area in which the embedded application will display its picture. After being created, the TextureView control has a Surface object and information such as width, height, and density for delivery to the embedded application.
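By way of non-limiting illustration, the following host-side sketch adds a TextureView control and captures the Surface object together with width, height, and density once the underlying SurfaceTexture becomes available; the class name, container, and fixed size are merely exemplary assumptions.

```java
// Host-side sketch: the TextureView marks the embedded application's display area and yields
// the Surface plus size/density information once its SurfaceTexture is ready.
import android.graphics.SurfaceTexture;
import android.view.Surface;
import android.view.TextureView;
import android.widget.FrameLayout;

public class EmbeddedViewContainer implements TextureView.SurfaceTextureListener {

    private final TextureView textureView;
    private final float density;

    public EmbeddedViewContainer(FrameLayout hostArea) {
        density = hostArea.getResources().getDisplayMetrics().density;
        textureView = new TextureView(hostArea.getContext());
        // Size and position of this control define where the embedded view will appear.
        hostArea.addView(textureView, new FrameLayout.LayoutParams(800, 600));
        textureView.setSurfaceTextureListener(this);
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        // Wrap the SurfaceTexture in a Surface; this is what is handed to the embedded
        // application together with the width, height, and density parameters.
        Surface surface = new Surface(st);
        onSurfaceReady(surface, width, height, density);
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) { }
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture st) { }

    protected void onSurfaceReady(Surface surface, int width, int height, float density) {
        // Hook for forwarding the parameters to the embedded application (see the service sketch below).
    }
}
```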
Optionally, after the TextureView control has been added, the Surface object and the parameters of the TextureView control may be sent to the embedded application in response to the host application being launched, wherein the parameters include information such as the width, height, and resolution of the view display area of the embedded application in the host application.
Optionally, in response to a call request of the host application, the embedded application binds a service of the host application; after successful binding, the host application sends the Surface object and the parameters of the TextureView control to the embedded application. Illustratively, a host service (HostService) exists in the host application, and an embedded-page service (PageEmbedService) exists in the embedded application. When the host application is launched, it calls a system method to start the PageEmbedService, thereby informing the embedded application that the host application has started; the PageEmbedService then binds the HostService to establish a connection with the host application. After successful binding, the embedded application asks the host, by calling a defined showSurface interface, to pass back information such as the Surface object of the TextureView control and the parameters of the TextureView control, and the host application returns this information to the embedded application via a callback function carried in the showSurface interface.
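By way of non-limiting illustration, the following embedded-application-side sketch outlines the binding handshake described above. The AIDL interface names IHostService and ISurfaceCallback, their method signatures, the intent action, and the package name are assumptions; the present disclosure only names the HostService, the PageEmbedService, and the showSurface interface.

```java
// Embedded-application side sketch of binding the host service and requesting the Surface
// and TextureView parameters back through a callback. IHostService and ISurfaceCallback are
// assumed AIDL interfaces; all concrete names here are illustrative.
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.os.IBinder;
import android.os.RemoteException;
import android.view.Surface;

public class HostBinding {

    private final ServiceConnection connection = new ServiceConnection() {
        @Override
        public void onServiceConnected(ComponentName name, IBinder binder) {
            IHostService host = IHostService.Stub.asInterface(binder);
            try {
                // Ask the host to pass back the Surface and TextureView parameters via callback.
                host.showSurface(new ISurfaceCallback.Stub() {
                    @Override
                    public void onSurface(Surface surface, int width, int height, float density) {
                        // Hand off to the code that creates the VirtualDisplay and Presentation
                        // (see the sketches that follow).
                        onSurfaceReceived(surface, width, height, density);
                    }
                });
            } catch (RemoteException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void onServiceDisconnected(ComponentName name) { }
    };

    /** Called by the embedded-page service once it has been started by the host application. */
    public void bindHostService(Context context) {
        Intent intent = new Intent("com.example.host.BIND_HOST_SERVICE"); // action is illustrative
        intent.setPackage("com.example.host");                            // package is illustrative
        context.bindService(intent, connection, Context.BIND_AUTO_CREATE);
    }

    protected void onSurfaceReceived(Surface surface, int width, int height, float density) {
        // Override or replace with the embedding logic.
    }
}
```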
In step S120, a virtual display object and a dual-display Presentation object are created in the embedded application, and the view of the embedded application is drawn onto the Surface object through the virtual display object and the dual-display Presentation object, so as to display the view of the embedded application in the host application. Optionally, the creation of the virtual display object and the dual-display Presentation object specifically includes: creating the virtual display object based on the Surface object and the parameters of the TextureView control; and creating, by means of the virtual display object, a dual-display Presentation object associated with the virtual display object. Optionally, drawing the view of the embedded application onto the Surface object through the virtual display object and the dual-display Presentation object includes: rendering the view of the embedded application into the virtual display object by means of the dual-display Presentation object; and drawing, by the virtual display object, the rendered content onto the Surface object provided by the host application.
A virtual display (VirtualDisplay) object is a virtualized display that can render various resources such as images and output the rendered content in the form of a media stream. For example, after receiving information such as the Surface object and/or the parameters of the TextureView control passed by the host application, the embedded application may pass this information to the system DisplayManager class by calling the createVirtualDisplay method, thereby creating the VirtualDisplay object.
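By way of non-limiting illustration, a minimal sketch of creating the VirtualDisplay object from the parameters received from the host application is given below; the helper class and display name are merely exemplary, and densityDpi is the integer DPI value of the host display area.

```java
// Minimal sketch of creating a VirtualDisplay backed by the host's Surface.
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

public final class VirtualDisplayFactory {

    public static VirtualDisplay create(Context context, Surface surface,
                                        int width, int height, int densityDpi) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        // Everything rendered onto the returned VirtualDisplay ends up in the host's Surface.
        return dm.createVirtualDisplay(
                "embedded-view",        // display name (illustrative)
                width, height, densityDpi,
                surface,
                0 /* flags: private, application-owned display */);
    }
}
```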
The dual-display Presentation object is a multi-display support mechanism provided by Android and is designed for presenting content on a secondary display such as an external display or projector. The present application uses the dual-display Presentation object to render the view of the embedded application into the VirtualDisplay object. For example, the VirtualDisplay object may be created first and then passed in when the dual-display Presentation object is created. After the dual-display Presentation object has been created, its show() method can be called directly; at this point the system starts the Presentation object and renders it into the VirtualDisplay object, so that the VirtualDisplay object draws and displays the rendered content onto the TextureView control in the designated area of the host application.
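By way of non-limiting illustration, a minimal sketch of starting a Presentation object on the VirtualDisplay object is given below. The helper class and layout resource are merely exemplary, and, depending on the Android version, the Presentation window type or flags may need adjusting for an application-owned private virtual display (assumption). The Fragment-hosting Presentation subclass used in the full scheme is sketched further below.

```java
// Minimal sketch: showing a Presentation on the VirtualDisplay so its content is drawn into
// the host's Surface.
import android.app.Presentation;
import android.content.Context;
import android.hardware.display.VirtualDisplay;

public final class PresentationLauncher {

    public static Presentation showOn(Context context, VirtualDisplay virtualDisplay) {
        // Bind the Presentation to the Display backing the VirtualDisplay.
        Presentation presentation = new Presentation(context, virtualDisplay.getDisplay());
        presentation.setContentView(R.layout.embedded_root);  // layout resource is illustrative
        // show() starts rendering: the system draws the Presentation into the VirtualDisplay,
        // which in turn draws into the Surface supplied by the host application.
        presentation.show();
        return presentation;
    }
}
```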
It should be noted that, in the prior art, the dual-display Presentation object can only be used to load and manage View controls; it cannot load standard Android Fragment controls. Loading a Fragment control in the Android system depends on an Activity component, which acts as the host of the Fragment control. To address this, the present application proposes to modify the dual-display Presentation object: in the process of creating the dual-display Presentation object in step S120, the dual-display Presentation object is taken as the host of the Fragment control, and the dual-display Presentation object is used to load and manage the Fragment control (e.g., loading a page and managing the page's lifecycle).
Optionally, the modification of the dual-display Presentation object includes: creating a Fragment controller (FragmentController) object and, during its creation, designating the dual-display Presentation object as the host of the Fragment control. In order to manage the Fragment control, a Fragment manager (FragmentManager) object is needed, which can be obtained, for example, by providing a getSupportFragmentManager method during the creation of the Presentation object. In addition, during the creation of the FragmentController object, the current Presentation object is made the host of the Fragment control by returning the current Presentation object in the onGetHost() method, and the layout inflater (LayoutInflater) object of the Presentation object can then be provided to the Fragment control by implementing the onGetLayoutInflater() method.
Optionally, the modification of the dual-display Presentation object further includes: making the lifecycle state of the Fragment control consistent with the lifecycle state of the dual-display Presentation object by implementing corresponding lifecycle functions. In particular, in order to distribute the lifecycle of the Presentation object to the Fragment control, when the lifecycle of the Presentation object reaches onCreate(), onStart(), onStop(), and destruction, the methods mFragments.dispatchCreate(), mFragments.dispatchStart(), mFragments.dispatchStop(), and mFragments.dispatchDestroy() are executed respectively, so that the lifecycle management of the Fragment control by the host Presentation object is completed. Specifically, when the dual-display Presentation object is initialized (e.g., the onCreate() method is executed), initialization of the Fragment control is correspondingly performed (e.g., the mFragments.dispatchCreate() method is executed); when the dual-display Presentation object changes from invisible to visible (e.g., the onStart() method is executed), the Fragment control is correspondingly changed from invisible to visible (e.g., the mFragments.dispatchStart() method is executed); when the dual-display Presentation object becomes completely invisible (e.g., the onStop() method is executed), the Fragment control is correspondingly made completely invisible (e.g., the mFragments.dispatchStop() method is executed); and when the dual-display Presentation object is destroyed (e.g., a destroy() method is executed), destruction of the Fragment control is correspondingly performed (e.g., the mFragments.dispatchDestroy() method is executed).
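By way of non-limiting illustration, the following sketch combines the two modifications described above, using the public androidx.fragment classes FragmentController and FragmentHostCallback that are intended for custom Fragment hosts. The class, field, and resource names (EmbeddedPresentation, mFragments, R.layout.embedded_root) and the destroy() hook are merely exemplary assumptions, not names taken from the present disclosure.

```java
// Sketch of a Presentation that replaces the Activity component as the Fragment host and
// distributes its own lifecycle to the hosted Fragment controls.
import android.content.Context;
import android.hardware.display.VirtualDisplay;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.view.LayoutInflater;
import android.view.View;

import androidx.fragment.app.FragmentController;
import androidx.fragment.app.FragmentHostCallback;
import androidx.fragment.app.FragmentManager;

public class EmbeddedPresentation extends android.app.Presentation {

    // The FragmentController is created with this Presentation as host (via HostCallbacks).
    private final FragmentController mFragments =
            FragmentController.createController(new HostCallbacks());

    public EmbeddedPresentation(Context outerContext, VirtualDisplay virtualDisplay) {
        super(outerContext, virtualDisplay.getDisplay());
    }

    public FragmentManager getSupportFragmentManager() {
        return mFragments.getSupportFragmentManager();
    }

    // Lifecycle distribution from the Presentation to its Fragment controls.

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.embedded_root);   // layout resource is illustrative
        mFragments.attachHost(null /* no parent Fragment */);
        mFragments.dispatchCreate();   // Presentation initialized -> initialize Fragments
    }

    @Override
    protected void onStart() {
        super.onStart();
        mFragments.dispatchStart();    // invisible -> visible
    }

    @Override
    protected void onStop() {
        super.onStop();
        mFragments.dispatchStop();     // completely invisible
    }

    /** Called when the view embedding is torn down (hook name is an assumption). */
    public void destroy() {
        mFragments.dispatchDestroy();  // Presentation destroyed -> destroy Fragments
        dismiss();
    }

    /** Makes this Presentation, instead of an Activity, the host of the Fragment controls. */
    private class HostCallbacks extends FragmentHostCallback<EmbeddedPresentation> {
        HostCallbacks() {
            super(getContext(), new Handler(Looper.getMainLooper()), 0 /* windowAnimations */);
        }

        @Override
        public EmbeddedPresentation onGetHost() {
            return EmbeddedPresentation.this;   // the current Presentation acts as host
        }

        @Override
        public LayoutInflater onGetLayoutInflater() {
            return EmbeddedPresentation.this.getLayoutInflater();
        }

        @Override
        public View onFindViewById(int id) {
            return EmbeddedPresentation.this.findViewById(id);
        }
    }
}
```

A page Fragment control can then be added through getSupportFragmentManager(), for example by committing a transaction that places it into a container view declared in the content layout.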
In addition, user interaction events are also distributed by way of cross-process delivery. Optionally, the method 10 may further include step S130: passing a user interaction event received by the TextureView control to the embedded application across processes. For example, after a user interaction event is received by the TextureView control in the host application, the type and coordinates of the user interaction event may be passed to the dual-display Presentation object of the embedded application. Specifically, when the user interaction event is a touch event, the TextureView control in the host application triggers the system's onTouchEvent(MotionEvent) method after receiving the touch event; this method is called once for each finger touch and is given an object carrying the touch information (i.e., a MotionEvent object). The MotionEvent object is then passed across applications to the dispatchTouchEvent(MotionEvent) method of the embedded application's Presentation object, thereby completing the delivery of the entire touch event between the host process and the embedded application.
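By way of non-limiting illustration, the following host-side sketch forwards touch events received by the TextureView control across the process boundary; the AIDL method dispatchTouch(MotionEvent) and the class names are merely exemplary assumptions, and MotionEvent is Parcelable, so it can be marshalled between processes unchanged.

```java
// Host-side sketch: capture touch events on the TextureView and forward them to the embedded
// application, which replays them into its Presentation. IEmbeddedService is an assumed AIDL
// interface.
import android.content.Context;
import android.os.RemoteException;
import android.view.MotionEvent;
import android.view.TextureView;

public class ForwardingTextureView extends TextureView {

    private IEmbeddedService embeddedService;  // binder proxy to the embedded application

    public ForwardingTextureView(Context context) {
        super(context);
    }

    public void setEmbeddedService(IEmbeddedService service) {
        this.embeddedService = service;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Called once per touch; the MotionEvent carries the event type and coordinates.
        if (embeddedService != null) {
            try {
                embeddedService.dispatchTouch(event);
            } catch (RemoteException e) {
                e.printStackTrace();
            }
        }
        return true;  // consume in the host; the embedded side handles the gesture
    }
}

// Embedded side (inside the service binder implementation), replaying the event into the
// Presentation so that it reaches the hosted Fragment controls:
//
//     @Override
//     public void dispatchTouch(MotionEvent event) {
//         presentation.dispatchTouchEvent(event);
//     }
```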
According to the method 10 of one or more embodiments of the present application, the view of an embedded application can be displayed on the interface of a host application while a dual-display Presentation object is used in place of an Activity component to load and manage the Fragment controls of the embedded application. This avoids the problem that, because the host is not an Activity component, Fragment controls cannot be loaded and windows such as Dialog controls and PopupWindow controls cannot be popped up, thereby enabling more dynamic and flexible user interface management.
FIG. 2 is a schematic block diagram of a computer system 20 in accordance with one or more embodiments of the application. Illustratively, the computer system 20 is an in-vehicle system. The computer system 20 includes a memory 210, a processor 220, and a computer program 230 stored on the memory 210 and executable on the processor 220; execution of the computer program 230 causes the method 10 shown in FIG. 1 to be performed.
In addition, as described above, the present application may also be embodied as a computer storage medium in which a program for causing a computer to execute the method 10 shown in FIG. 1 is stored. Various types of computer storage media may be employed, such as disks (e.g., magnetic disks and optical disks), cards (e.g., memory cards and optical cards), semiconductor memories (e.g., ROM and non-volatile memory), and tapes (e.g., magnetic tapes and cassette tapes).
Where applicable, hardware, software, or a combination of hardware and software may be used to implement the various embodiments provided by the present application. Moreover, where applicable, the various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the scope of the present application. Where applicable, the various hardware components and/or software components set forth herein can be separated into sub-components comprising software, hardware, or both without departing from the scope of the present application. Further, where applicable, it is contemplated that software components may be implemented as hardware components, and vice versa.
Software in accordance with the present application, such as program code and/or data, may be stored on one or more computer storage media. It is also contemplated that the software identified herein may be implemented using one or more general-purpose or special-purpose computers and/or computer systems that are networked and/or otherwise. Where applicable, the order of the various steps described herein may be changed, combined into composite steps, and/or divided into sub-steps to provide features described herein.
The embodiments and examples set forth herein are presented to best explain embodiments in accordance with the present application and its particular application, and thereby to enable those skilled in the art to make and use the application. However, those skilled in the art will recognize that the foregoing description and examples have been presented for the purposes of illustration and example only. The description as set forth is not intended to be exhaustive or to limit the application to the precise form disclosed.

Claims (11)

1. A method for cross-application view embedding, the method comprising the steps of:
in response to launch of a host application, passing a surface object created by the host application to an embedded application, wherein the embedded application implements user interface management based on a fragment control;
creating a virtual display object and a dual-display object in the embedded application, and drawing a view of the embedded application onto the surface object through the virtual display object and the dual-display object, so as to display the view of the embedded application in the host application, wherein the creation of the dual-display object comprises:
taking the dual-display object as a host of the fragment control, and loading and managing the fragment control by means of the dual-display object.
2. The method of claim 1, further comprising:
adding a texture view control in the host application and setting parameters of the texture view control, wherein the texture view control designates a view display area of the embedded application, and the surface object is embedded in the texture view control.
3. The method of claim 2, further comprising:
in response to launch of the host application, sending parameters of the texture view control to the embedded application, wherein the parameters include a width, a height, and a resolution of the view display area of the embedded application in the host application.
4. The method of claim 3, wherein sending the parameters of the texture view control to the embedded application and passing the surface object created by the host application to the embedded application comprise:
in response to a call request of the host application, binding, by the embedded application, a service of the host application; and
after successful binding, sending, by the host application, the surface object and the parameters of the texture view control to the embedded application.
5. The method according to any one of claims 2 to 4, wherein
creating the virtual display object and the dual-display object in the embedded application comprises:
creating the virtual display object based on the surface object and the parameters of the texture view control, and
creating, by means of the virtual display object, a dual-display object associated with the virtual display object; and
drawing the view of the embedded application onto the surface object through the virtual display object and the dual-display object comprises:
rendering the view of the embedded application into the virtual display object by means of the dual-display object, and
drawing, by the virtual display object, the rendered content onto the surface object provided by the host application.
6. The method of claim 1, wherein taking the dual-display object as the host of the fragment control and loading and managing the fragment control by means of the dual-display object comprises:
creating a fragment controller object and, during the creation, designating the dual-display object as the host of the fragment control; and
making the lifecycle state of the fragment control consistent with the lifecycle state of the dual-display object by implementing corresponding lifecycle functions.
7. The method of claim 6, wherein making the lifecycle state of the fragment control consistent with the lifecycle state of the dual-display object by implementing corresponding lifecycle functions comprises:
when the dual-display object is initialized, correspondingly performing initialization of the fragment control;
when the dual-display object changes from invisible to visible, correspondingly changing the fragment control from invisible to visible;
when the dual-display object becomes completely invisible, correspondingly making the fragment control completely invisible; and
when the dual-display object is destroyed, correspondingly performing destruction of the fragment control.
8. The method of claim 2, further comprising:
passing a user interaction event received by the texture view control to the embedded application across processes.
9. The method of claim 8, wherein passing the user interaction event received by the texture view control to the embedded application across processes comprises:
in response to the user interaction event being received by the texture view control, passing a type and coordinates of the user interaction event to the dual-display object of the embedded application.
10. A computer system, comprising: a memory; a processor; and a computer program stored on the memory and executable on the processor, wherein execution of the computer program causes the method according to any one of claims 1-9 to be performed.
11. A computer storage medium, characterized in that it comprises instructions which, when executed, perform the method according to any one of claims 1-9.
CN202310807468.0A 2023-06-30 2023-06-30 Method, computer system, and storage medium for cross-application view embedding Pending CN116880941A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310807468.0A CN116880941A (en) 2023-06-30 2023-06-30 Method, computer system, and storage medium for cross-application view embedding

Publications (1)

Publication Number Publication Date
CN116880941A 2023-10-13

Family

ID=88259645

Country Status (1)

Country Link
CN (1) CN116880941A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination