CN115546410A - Window display method and device, electronic equipment and storage medium - Google Patents

Window display method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN115546410A
CN115546410A CN202211276051.8A CN202211276051A CN115546410A CN 115546410 A CN115546410 A CN 115546410A CN 202211276051 A CN202211276051 A CN 202211276051A CN 115546410 A CN115546410 A CN 115546410A
Authority
CN
China
Prior art keywords
window
dimensional
rendering
displayed
virtual scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211276051.8A
Other languages
Chinese (zh)
Inventor
王晓辰
侯清辰
石孟欧
杨行
陈昊芝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Positive Negative Infinite Technology Co ltd
Original Assignee
Beijing Positive Negative Infinite Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Positive Negative Infinite Technology Co ltd filed Critical Beijing Positive Negative Infinite Technology Co ltd
Publication of CN115546410A publication Critical patent/CN115546410A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures

Abstract

The embodiment of the application provides a method and a device for displaying a window in a three-dimensional virtual scene, electronic equipment and a computer readable storage medium, and relates to the technical field of computers. The method comprises the following steps: rendering a three-dimensional virtual scene and generating two-dimensional texture information of at least one window to be displayed, wherein each window is used for displaying an operation interface of an application program; acquiring a starting parameter corresponding to each window to be displayed, generating a rendering result according to the starting parameter corresponding to each window to be displayed, and storing the rendering result in a graphic cache region, wherein the rendering result is an operation interface of a corresponding application program; and obtaining the rendering result from the graphics cache region, generating the window by combining the two-dimensional texture information, and displaying the window in the three-dimensional virtual scene. The embodiment of the application saves CPU operation resources and bus bandwidth, can simultaneously support the display of more than 10 application program windows, and can also realize the free movement of the windows in a three-dimensional virtual scene.

Description

Window display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for displaying a window in a three-dimensional virtual scene, an electronic device, a computer-readable storage medium, and a computer program product.
Background
With the development of virtual reality technology and the improvement of computer performance, in order to enable a user to have a convenient human-computer interaction mode, the virtual reality device can display a window of an application program, and the user can interact with the virtual reality device in various modes.
The existing native window display method of the android system does not support the display of windows in a three-dimensional virtual scene, and other custom methods support that the number of windows displayed simultaneously is small, and generally only 3 windows can be displayed simultaneously at most.
Disclosure of Invention
The embodiment of the application provides a method and a device for displaying a window in a three-dimensional virtual scene, an electronic device, a computer-readable storage medium and a computer program product, which can solve the above problems in the prior art. The technical scheme is as follows:
according to an aspect of an embodiment of the present application, a method for displaying a window in a three-dimensional virtual scene is provided, where the method includes:
rendering a three-dimensional virtual scene and generating two-dimensional texture information of at least one window to be displayed, wherein each window is used for displaying an operation interface of an application program;
acquiring a starting parameter corresponding to each window to be displayed, generating a rendering result according to the starting parameter corresponding to each window to be displayed, and storing the rendering result in a graphic cache region, wherein the rendering result is an operation interface of a corresponding application program;
and obtaining the rendering result from the graphics cache region, generating the window by combining the two-dimensional texture information, and displaying the window in the three-dimensional virtual scene.
As an optional embodiment, the starting parameters are pre-stored in a three-dimensional rendering engine, and the three-dimensional rendering engine is configured to render the three-dimensional virtual scene and generate two-dimensional texture information of at least one window to be displayed;
the method for acquiring the starting parameters corresponding to each window to be displayed comprises the following steps:
creating virtual display components which correspond to the windows to be displayed one by one, wherein the virtual display components are used for generating rendering results according to the starting parameters corresponding to the windows to be displayed and storing the rendering results in a graphic cache region;
the acquiring of the starting parameters corresponding to each window to be displayed includes:
and calling a preset window starting interface through the three-dimensional rendering engine, and transmitting each starting parameter to the corresponding virtual display component.
As an alternative embodiment, creating virtual display components in one-to-one correspondence with the windows to be displayed further includes: creating a picture reading component bound with each virtual display component;
the obtaining the rendering result from the graphics cache comprises:
monitoring the rendering progress of the virtual display component through the picture reading component, and when the virtual display component is determined to store a rendering result into a graph cache region, calling back the rendering result from the graph cache region by the picture reading component and sending the rendering result to a native rendering module;
the native rendering module is used for generating the window according to the obtained rendering result and the two-dimensional texture information, and displaying the window in the three-dimensional virtual scene.
As an optional embodiment, the three-dimensional rendering engine generates two-dimensional texture information of at least one window to be presented, and then further includes: and transmitting a handle of the two-dimensional texture information to the native rendering module, so that the native rendering module obtains the two-dimensional texture information according to the handle.
As an optional embodiment, the displaying the window in the three-dimensional virtual scene further includes:
sending, by the three-dimensional rendering engine, an input event to the native event module in response to the input event for a target window in the three-dimensional virtual scene;
and transmitting the input event to a virtual display component corresponding to the target window through a native event injection mechanism by the native event module, so that the virtual display component updates a rendering result according to the input event.
As an optional embodiment, the displaying the window in the three-dimensional virtual scene further includes:
in response to a request for calling up an input method panel, instructing the three-dimensional rendering engine to display a pre-created custom keyboard in the three-dimensional virtual scene through a pre-defined transfer service by a pre-defined input method service;
and responding to the operation of the user-defined keyboard, and sending key value information input by the operation to the input method service through the transfer service by the three-dimensional rendering engine.
As an alternative embodiment, in response to a request to invoke an input method panel, the method further comprises:
and establishing the connection between the transfer service and the input method service and the connection between the transfer service and the three-dimensional rendering engine through a cross-process communication technology.
According to another aspect of the embodiments of the present application, there is provided an apparatus for displaying windows in a three-dimensional virtual scene, the apparatus including:
the texture generation module is used for rendering a three-dimensional virtual scene and generating two-dimensional texture information of at least one window to be displayed, and each window is used for displaying an operation interface of an application program;
the rendering result generating module is used for acquiring the starting parameters corresponding to each window to be displayed, generating a rendering result according to the starting parameters corresponding to each window to be displayed and storing the rendering result into the graphic cache region, wherein the rendering result is an operation interface of a corresponding application program;
and the window display module is used for obtaining the rendering result from the graphic cache region, generating the window by combining the two-dimensional texture information and displaying the window in the three-dimensional virtual scene.
According to another aspect of embodiments of the present application, there is provided an electronic device including: memory, a processor and a computer program stored on the memory, the processor executing the computer program to perform the steps of the method of the above aspect.
According to yet another aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the above aspect.
According to an aspect of embodiments of the present application, there is provided a computer program product comprising a computer program which, when executed by a processor, performs the steps of the method of the above aspect.
The technical scheme provided by the embodiment of the application has the following beneficial effects:
the method comprises the steps of generating two-dimensional texture information of at least one window to be displayed by rendering a three-dimensional virtual scene, obtaining starting parameters corresponding to the windows to be displayed, generating a rendering result according to the starting parameters corresponding to the windows to be displayed, storing the rendering result in a graph cache area, generating the windows by obtaining the rendering result from the graph cache area and combining the two-dimensional texture information, directly rendering the rendering result to two-dimensional textures by the graph cache area, saving CPU (central processing unit) computing resources and bus bandwidth, simultaneously supporting the display of more than 10 application programs, and far exceeding the window display scheme in the prior art.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic flowchart of a method for displaying a window in a three-dimensional virtual scene according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a process of starting and displaying multiple windows according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating an input event of a delivery window according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of a method for inputting a keyboard in a window according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a window displaying apparatus in a three-dimensional virtual scene according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below in conjunction with the drawings in the present application. It should be understood that the embodiments set forth below in connection with the drawings are exemplary descriptions for explaining technical solutions of the embodiments of the present application, and do not limit the technical solutions of the embodiments of the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, information, data, steps, operations, elements, and/or components, but do not preclude the presence or addition of other features, information, data, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein indicates at least one of the items defined by the term, e.g., "a and/or B" may be implemented as "a", or as "B", or as "a and B".
To make the objects, technical solutions and advantages of the present application more clear, the following detailed description of the embodiments of the present application will be made with reference to the accompanying drawings.
The terms referred to in this application will first be introduced and explained:
virtual scenes, which are different from real world scenes output by devices, can form visual perception of the virtual scenes through naked eyes or assistance of the devices, such as two-dimensional images output by a display screen, and three-dimensional images output by stereoscopic display technologies such as stereoscopic projection, virtual reality and augmented reality technologies; in addition, various real-world-simulated perceptions such as auditory perception, tactile perception, olfactory perception, motion perception and the like can be formed through various possible hardware.
The window is a frame which divides the display screen of the microcomputer, namely the window. Each window is responsible for displaying and processing a certain type of information. The user can work on any window at will and exchange information between windows. There is special window management software in the computer to manage the window operations.
The window is the most important part of the user interface. It is a rectangular area on the screen corresponding to an application, including the frame and client area, and is the visual interface between the user and the application that generated the window. Each time a user begins to run an application, the application creates and displays a window; when the user manipulates an object in the window, the program reacts accordingly. The user terminates the running of a program by closing a window; the corresponding application is selected by selecting the corresponding application window.
Android (Android), a free and open source operating system based on the Linux kernel (which does not contain GNU components).
The application provides a method and a device for displaying a window in a three-dimensional virtual scene, an electronic device, a computer readable storage medium and a computer program product, which aim to solve the above technical problems in the prior art.
The technical solutions of the embodiments of the present application and the technical effects produced by the technical solutions of the present application will be described below through descriptions of several exemplary embodiments. It should be noted that the following embodiments may be referred to, referred to or combined with each other, and the description of the same terms, similar features, similar implementation steps, etc. in different embodiments is not repeated.
The embodiment of the application provides a method for displaying a window in a three-dimensional virtual scene, and as shown in fig. 1, the method comprises the following steps:
s101, rendering a three-dimensional virtual scene and generating two-dimensional texture information of at least one window to be displayed, wherein each window is used for displaying an operation interface of an application program.
The embodiment of the present application firstly needs to render a three-dimensional virtual scene to provide a display environment for a subsequent display window, and further, the embodiment of the present application also needs to generate two-dimensional texture information of at least one window to be displayed, the number of the windows to be displayed is not specifically limited, and the embodiment of the present application can support the display of more than 10 windows through actual measurement. Of course, the display can be performed normally even in the case of less than 10 windows.
The two-dimensional texture information of the window to be displayed is equivalent to a picture in the three-dimensional virtual scene, and the picture can be placed at any position. Along with the dragging operation of a user, the position coordinate of the picture is updated in real time, so that free movement is realized, a rectangular area capable of responding to an input event is placed outside the window, and the user can drive the window to move freely by controlling rays to drag the rectangular area through a controller; when the pointer is positioned at the edge of the window, the pointer is displayed in a zooming mode, and at the moment, the user controls the ray to perform a dragging operation through the controller, so that the window can be zoomed.
The two-dimensional texture information may include information such as the width, height, and color space of a corresponding window, and by generating the two-dimensional texture information, the normalized processing of the window size and color expression is realized. The color space of the embodiment of the application can be an RGB color space, an HSV color space, and the like. The RGB color space is a space that separates color signals into three attributes: red (red), green (green), and blue (blue), while the HSV color space separates color signals into three attributes: hue (Hue, H), saturation (S), brightness (Value, V).
In the embodiment of the application, one window corresponds to one application program, and the window is used for displaying an operation interface of the corresponding application program, it can be understood that one window displays a preset operation interface of the application program in an initial state, for example, a home page.
S102, obtaining a starting parameter corresponding to each window to be displayed, generating a rendering result according to the starting parameter corresponding to each window to be displayed, and storing the rendering result in a graphic cache region, wherein the rendering result is an operation interface of a corresponding application program.
It should be noted that, in the embodiment of the present application, the acquisition of the start-up parameter may be performed after step S101, or may be performed simultaneously with step S101. The launch parameters of the embodiment of the present application may be parameters related to launching a window and parameters related to launching an application, where the parameters related to launching the window may include resolution and the like, and the parameters related to launching the application may include parameters of a page, a link and the like corresponding to launching the window. By adjusting the parameters, the self-defined resolution and the semitransparent effect of the window can be realized.
According to the method and the device, the rendering result is generated according to the acquired starting parameters, the rendering result is the operation interface of the corresponding application program, and it is noted that the rendering result generated by the method and the device is directly stored in a Frame Buffer area (Frame Buffer). The content of the FrameBuffer corresponds to the interface display on the screen, and can be simply understood as the cache corresponding to the display content on the screen, and the content in the FrameBuffer is modified, namely the content on the screen is modified, so that the effect can be directly observed from the display by directly operating the FrameBuffer. FrameBuffer is a logical concept and does not have a fixed physical area on the video or memory. In fact, the physical property is a video memory or a memory, and as long as a section of memory or a video memory is arbitrarily allocated in a space range (a physical address space of a GPU) that can be accessed by a Graphics Processing Unit (GPU), the memory or the video memory can be used as a FrameBuffer.
S103, obtaining the rendering result from the graphics cache region, generating the window by combining the two-dimensional texture information, and displaying the window in the three-dimensional virtual scene.
According to the embodiment of the application, the rendering result is obtained from the graphic cache region, the rendering result is directly rendered to the two-dimensional texture from the graphic cache region, and CPU (Central processing Unit) operation resources and bus bandwidth are saved. Because the rendering result is the operation interface of the rendered application program, and the two-dimensional texture information is the basic information of the window, the two are combined to obtain the window displayed in the three-dimensional virtual scene.
According to the window display method in the three-dimensional virtual scene, the three-dimensional virtual scene is rendered to generate the two-dimensional texture information of at least one window to be displayed, the starting parameter corresponding to each window to be displayed is obtained, the rendering result is generated according to the starting parameter corresponding to each window to be displayed and stored in the graphic cache region, the rendering result is obtained from the graphic cache region and is combined with the two-dimensional texture information to generate the window, the rendering result is directly rendered to the two-dimensional texture from the graphic cache region, CPU (Central processing Unit) operation resources and bus bandwidth are saved, the display of more than 10 windows of application programs can be simultaneously supported, the window display scheme is far superior to that of the window in the prior art, and because the window is drawn based on the two-dimensional texture information, the two-dimensional texture is equivalent to a picture in the three-dimensional space, the picture can be placed at any position, and the position coordinates of the picture are updated in real time along with the dragging operation of a user, and the free movement of the window in the three-dimensional virtual scene is realized.
On the basis of the above embodiments, as an alternative embodiment, the start parameters are stored in the three-dimensional rendering engine in advance. In one embodiment, the three-dimensional rendering engine of the present application may be a Unity engine. The three-dimensional rendering engine of the embodiment of the present application is configured to render the three-dimensional virtual scene and generate two-dimensional texture information of at least one window to be displayed, that is, step S101 is executed by the three-dimensional rendering engine.
Obtaining a starting parameter corresponding to each window to be displayed, wherein the method comprises the following steps:
and creating virtual display components which correspond to the windows to be displayed one by one. Specifically, in the embodiment of the present application, a virtual display may be created through displaymanager.
The following is a specific method of creating a virtual display:
createVirtualDisplay(@NonNull String name,
int width,int height,int densityDpi,@Nullable Surface surface,int flags);
the parameters involved include:
name-the name of the virtual display, must not be empty.
width-the width (in pixels) of the virtual display, must be greater than 0.
height-the height of the virtual display (in pixels), must be greater than 0.
DensityDpi-the virtual display density in dpi, must be greater than 0.
surface-the surface to which the content of the virtual display should be rendered, null if not initially.
flags-combination of virtual display flags:
VIRTUAL_DISPLAY_FLAG_PUBLIC、
VIRTUAL_DISPLAY_FLAG_PRESENTATION、
VIRTUAL_DISPLAY_FLAG_SECURE、
VIRTUAL _ DISPLAY _ FLAG _ OWN _ CONTENT _ ONLY or
VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR;
callback-callback called when the status of VirtualDisplay changes
Handler-the handler on which the listener should be called, null if the listener should be called on the Looper of the calling thread.
The content of the virtual display is rendered to the Surface provided by the application. The behavior of the virtual display depends on the flags provided to the method. By default, the virtual display is created as private, non-exposed, and insecure. Rights may be required using certain flags.
The virtual display component is used for generating a rendering result according to the starting parameter corresponding to the window to be displayed and storing the rendering result to the graphic cache region.
In one embodiment, the obtaining of the starting parameter corresponding to each window to be displayed includes:
and calling a preset window starting interface through a three-dimensional rendering engine, and transmitting each starting parameter to the corresponding virtual display component.
According to the embodiment of the application, the original virtual display assembly of the android system is created, the rendering result is generated by the virtual display assembly according to the starting parameter corresponding to the window to be displayed, and the rendering result is stored in the graphic cache region, so that the development difficulty is reduced, the practicability of the scheme can be enhanced, adaptation to different application programs is not needed, the starting parameter required by each virtual display assembly is transmitted by the three-dimensional rendering engine calling the window starting interface, and the window starting interface is an interface used for starting the window in the android system.
On the basis of the above embodiments, as an optional embodiment, when creating the virtual display component, the embodiment of the present application further includes creating a picture reading ImageReader component bound to each virtual display component. The ImageReader component allows an application to directly obtain the graphics data rendered to a surface.
Correspondingly, the rendering result is obtained from the graphics cache region, and the rendering result comprises:
monitoring the rendering progress of the virtual display assembly through the image reading assembly, and when the virtual display assembly is determined to store a rendering result into a graphic cache region, calling back the rendering result from the graphic cache region by the image reading assembly and sending the rendering result to a native rendering module.
According to the embodiment of the application, after the binding relationship between the picture reading assembly and the virtual display assembly is established, the picture reading assembly monitors the rendering progress of virtual display, and when the picture reading assembly determines that the virtual display assembly stores the rendering result into the graph cache region, the picture reading assembly recalls the rendering result from the graph cache region and sends the rendering result to the native rendering module. And the native rendering module is used for generating the window according to the obtained rendering result and the two-dimensional texture information and displaying the window in the three-dimensional virtual scene. Specifically, the native rendering module may implement rendering the rendering result on the two-dimensional texture information through a native interface of OpenGLES or Vulkan, so as to obtain the window.
On the basis of the foregoing embodiments, as an optional embodiment, the three-dimensional rendering engine generates two-dimensional texture information of at least one window to be displayed, and then further includes:
and transmitting a handle of the two-dimensional texture information to the native rendering module, so that the native rendering module obtains the two-dimensional texture information according to the handle. The Handle (Handle) is an identifier for identifying an object or item, and the embodiment of the application is used for describing a window, and the purpose of the Handle is to establish a unique connection with the accessed object, namely two-dimensional texture information of the window to be displayed.
Referring to fig. 2, which exemplarily shows a flowchart of starting and displaying multiple windows according to an embodiment of the present application. As shown, the embodiment involves a three-dimensional rendering engine layer and a native layer (i.e., the android native system).
The three-dimensional rendering engine layer renders the three-dimensional virtual scene and creates in it a rectangular area capable of responding to input events; the window can be moved by controlling ray movement, and its size can be adjusted. The three-dimensional rendering engine layer creates two-dimensional texture information for at least one window to be displayed, stores a handle of the two-dimensional texture information in the native application program, and stores the starting parameters corresponding to each window to be displayed in the three-dimensional rendering engine layer in advance.
The android native system creates Virtual Display components and picture reading (Image Reader) components in one-to-one correspondence with the windows to be displayed, binds each virtual display component to its picture reading component, and starts an application program in each virtual display component.
The three-dimensional rendering engine layer calls a window starting interface of the android native system and sends the pre-stored starting parameters corresponding to the window to be displayed to the virtual display component; the virtual display component generates a rendering result according to these starting parameters and stores it into a Frame Buffer in the graphics cache region. The picture reading component then retrieves the rendering result from the graphics cache region through a callback and sends it to the native rendering module.
The three-dimensional rendering engine layer emits rendering events at a fixed frequency, and the android native system correspondingly generates native rendering events. In response to a native rendering event, the native rendering module obtains the two-dimensional texture information according to the handle, draws the rendering result onto the two-dimensional texture information to generate the window, and displays the window in the three-dimensional virtual scene.
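The fixed-frequency draw step described above can be sketched as follows: on each engine tick, the native side resolves each window's texture by handle and copies that window's newest frame onto it. This is a plain-Java model with illustrative names; strings stand in for textures and frames.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the fixed-frequency render step: per tick, resolve each window's
// texture by handle and draw the latest buffered frame onto it.
public class RenderTick {
    public static Map<Long, String> textureByHandle = new HashMap<>();
    public static Map<Long, String> latestFrame = new HashMap<>();  // handle -> frame
    public static Map<Long, String> displayed = new HashMap<>();    // what's on screen

    // Native render event handler, driven by the engine's fixed-frequency tick.
    public static void onEngineTick() {
        for (Map.Entry<Long, String> e : latestFrame.entrySet()) {
            String texture = textureByHandle.get(e.getKey());
            // "Draw" the frame onto the texture (string concat models the copy).
            if (texture != null) displayed.put(e.getKey(), texture + "<-" + e.getValue());
        }
    }

    public static void main(String[] args) {
        textureByHandle.put(1L, "tex1");
        latestFrame.put(1L, "frame42");
        onEngineTick();
        System.out.println(displayed.get(1L));
    }
}
```

Because only the newest frame per window is drawn on each tick, a window that renders faster than the engine's tick rate simply has intermediate frames skipped rather than queued.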
On the basis of the foregoing embodiments, as an optional embodiment, after the three-dimensional rendering engine generates the two-dimensional texture information of the at least one window to be displayed, the method further includes:
in response to an input event for a target window in the three-dimensional virtual scene, sending, by the three-dimensional rendering engine, the input event to a native event module;
transmitting, by the native event module, the input event to the virtual display component corresponding to the target window through a native event injection mechanism, so that the virtual display component updates its rendering result according to the input event.
In the embodiment of the present application, an input event is an event generated by user interaction and may include, for example, key trigger information, gesture recognition information, and eyeball tracking information acquired through a handle controller, a keyboard, or a VR head-mounted display. After the three-dimensional rendering engine captures an input event for a certain window, it sends the input event to the native event module; specifically, it sends the specific operation and the parameter information of the input event. The native event module is a module created in the android system in the embodiment of the application for processing input events. The specific operation of an input event may be a click, a slide, a limb movement, and the like, and the parameter information represents the magnitude of the operation, such as the number of clicks, the angle and direction of a slide, or the frequency, angle, and direction of a limb movement. In some embodiments, the parameter information further includes coordinate information which, combined with the input event, can describe the specific operation intention, such as moving the window down by a preset distance.
In one embodiment, when transmitting an input event, the three-dimensional rendering engine also carries the window identifier corresponding to the input event, so that the native event module transmits the input event, through a preset event injection mechanism, to the virtual display component corresponding to that window identifier, and the virtual display component then updates its rendering result according to the input event.
According to the method for transmitting multi-window input events provided by the embodiment of the application, input events are aggregated by the three-dimensional rendering engine and sent to the native event module, and the native event module transmits them to the virtual display component through a native event injection mechanism, so that input event delivery can be completed in a lightweight manner.
Referring to fig. 3, which exemplarily shows a schematic flowchart of delivering a window input event according to an embodiment of the present application. As shown, the embodiment involves a three-dimensional rendering engine layer and a native layer (i.e., the android native system). In the three-dimensional rendering engine layer, if a triggered input event is determined to point to a certain target window, the input event (including its parameter information) is transmitted to the native event module, and the native event module transmits it, through a native event injection mechanism, to the virtual display component corresponding to the target window, so that the virtual display component updates its rendering result according to the input event.
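The routing step above — the engine attaching a window identifier and the native event module injecting the event into the matching virtual display — can be modeled as a dispatch table keyed by window id. A minimal sketch with illustrative names; a `Consumer<String>` stands in for the event injection mechanism:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Sketch of routing an input event to the virtual display that owns the
// target window, keyed by the window identifier the engine attaches.
public class EventRouter {
    // windowId -> the virtual display's event sink (stand-in for injection)
    private final Map<String, Consumer<String>> sinks = new HashMap<>();

    public void registerWindow(String windowId, Consumer<String> sink) {
        sinks.put(windowId, sink);
    }

    // Native event module: inject the event into the matching virtual display.
    // Returns false if no window matches the identifier.
    public boolean dispatch(String windowId, String event) {
        Consumer<String> sink = sinks.get(windowId);
        if (sink == null) return false;
        sink.accept(event);
        return true;
    }

    public static void main(String[] args) {
        EventRouter router = new EventRouter();
        StringBuilder received = new StringBuilder();
        router.registerWindow("win-1", received::append);
        router.dispatch("win-1", "click(120,48)");
        System.out.println(received);
    }
}
```

Keeping the table in one place is what makes the aggregation "lightweight": the engine never talks to individual displays, it only tags events with an id.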
On the basis of the foregoing embodiments, as an optional embodiment, the displaying the window in the three-dimensional virtual scene further includes:
in response to a request for calling up an input method panel, instructing, by a predefined input method service through a predefined transfer service, the three-dimensional rendering engine to display a pre-created custom keyboard in the three-dimensional virtual scene;
and in response to an operation on the custom keyboard, sending, by the three-dimensional rendering engine through the transfer service, the key value information input by the operation to the input method service.
A custom input method service replaces the native input method service in advance and is used to transmit keyboard input events and data. Because the custom input method service cannot be passively connected to, a transfer service is additionally implemented: connections between the transfer service and the input method service, and between the transfer service and the three-dimensional rendering engine, are established through a cross-process communication technology. A custom keyboard is created in the three-dimensional rendering engine to replace the native keyboard and is likewise connected to the transfer service through the cross-process communication technology, so that the system's keyboard input events and data are transmitted to the three-dimensional rendering engine layer via the transfer service.
When a user triggers an operation of calling up the input method panel in an interface, in response to the request for calling up the input method panel, the custom input method service instructs, through the transfer service, the three-dimensional rendering engine to display the pre-created custom keyboard in the three-dimensional virtual scene. The transfer service thus serves as middleware between the three-dimensional rendering engine and the input method service in the android native system, enabling communication between the two.
After the custom keyboard is displayed, in response to an operation on the custom keyboard, the three-dimensional rendering engine sends the key value information (such as ASCII code information) input by the operation to the input method service through the transfer service, and the key value information, after being sent to the input method service, acts on an input box in the window.
Referring to fig. 4, which exemplarily shows a flowchart of keyboard input in a window according to an embodiment of the present application. As shown, a custom input method service is created to replace the native input method service and is used to transmit keyboard input events and data. In response to a request for waking up the input method panel, the input method service sends a display-keyboard instruction to the transfer service; the instruction instructs the three-dimensional rendering engine to display the pre-created custom keyboard in the three-dimensional virtual scene. The transfer service forwards the display-keyboard instruction to the three-dimensional rendering engine, which displays the custom keyboard accordingly. In response to an operation on the custom keyboard, the three-dimensional rendering engine transmits the key value information (ASCII code) to the transfer service, which forwards it to the input method service; after being transmitted to the input method service, the key value information acts on an input box in the native application in the window to perform input display.
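The round trip in fig. 4 can be sketched as two directions through a relay: IME service → relay → engine to show the keyboard, and engine → relay → IME service to deliver key values into the window's input box. A deliberately simplified single-process model; the real system crosses process boundaries, and all names here are illustrative.

```java
// Sketch of the transfer ("relay") service bridging the engine-side custom
// keyboard and the replacement input method service. Single-process model;
// in the real system each arrow is a cross-process call.
public class KeyboardRelay {
    public static StringBuilder inputBox = new StringBuilder(); // input box in the window
    public static boolean keyboardVisible = false;

    // IME service -> transfer service -> engine: show the custom keyboard.
    public static void requestKeyboard() {
        keyboardVisible = true;
    }

    // Engine -> transfer service -> IME service: deliver one key value
    // (e.g. an ASCII code), which lands in the window's input box.
    public static void sendKey(char keyValue) {
        if (keyboardVisible) inputBox.append(keyValue);
    }

    public static void main(String[] args) {
        requestKeyboard();
        for (char c : "abc".toCharArray()) sendKey(c);
        System.out.println(inputBox);
    }
}
```

The design choice worth noting is that the relay owns both connections, so neither the engine nor the IME service ever needs to initiate a connection to the other.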
An embodiment of the present application provides a window display device in a three-dimensional virtual scene, as shown in fig. 5, the window display device in the three-dimensional virtual scene may include: a texture generation module 101, a rendering result generation module 102, and a window presentation module 103, wherein,
the texture generation module 101 is configured to render a three-dimensional virtual scene and generate two-dimensional texture information of at least one window to be displayed, where each window is used to display an operation interface of an application program;
a rendering result generation module 102, configured to obtain a starting parameter corresponding to each window to be displayed, generate a rendering result according to the starting parameter corresponding to each window to be displayed, and store the rendering result in a graphics cache region, where the rendering result is an operation interface of a corresponding application program;
a window displaying module 103, configured to obtain the rendering result from the graphics cache region, generate the window in combination with the two-dimensional texture information, and display the window in the three-dimensional virtual scene.
The apparatus of the embodiment of the present application can execute the method provided by the embodiments of the present application, and its implementation principle is similar. The actions executed by the modules of the apparatus correspond to the steps of the method in the embodiments of the present application; for a detailed functional description of the modules, reference may be made to the description of the corresponding method shown above, which is not repeated here.
The embodiment of the application provides an electronic device, including a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to implement the steps of the window display method in a three-dimensional virtual scene. Compared with the related art, the following can be achieved: two-dimensional texture information of at least one window to be displayed is generated by rendering a three-dimensional virtual scene; starting parameters corresponding to each window to be displayed are obtained; a rendering result is generated according to the starting parameters and stored in a graphics cache region; the rendering result is obtained from the graphics cache region and the window is generated in combination with the two-dimensional texture information. Since rendering results are drawn directly from the graphics cache region onto two-dimensional textures, CPU (Central Processing Unit) computing resources and bus bandwidth are saved, and the simultaneous display of more than 10 application programs is supported, far exceeding prior-art window display schemes.
In an alternative embodiment, an electronic device is provided, as shown in fig. 6, the electronic device 4000 shown in fig. 6 comprising: a processor 4001 and a memory 4003. Processor 4001 is coupled to memory 4003, such as via bus 4002. Optionally, the electronic device 4000 may further include a transceiver 4004, and the transceiver 4004 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data. In addition, the transceiver 4004 is not limited to one in practical applications, and the structure of the electronic device 4000 is not limited to the embodiment of the present application.
The Processor 4001 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 4001 may also be a combination that performs a computational function, including, for example, a combination of one or more microprocessors, a combination of a DSP and a microprocessor, or the like.
Bus 4002 may include a path that carries information between the aforementioned components. The bus 4002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 4002 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 6, but that does not indicate only one bus or one type of bus.
The Memory 4003 may be a ROM (Read-Only Memory) or other type of static storage device capable of storing static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device capable of storing information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact disc, laser disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store a computer program and can be read by a computer, without limitation.
The memory 4003 is used for storing computer programs for executing the embodiments of the present application, and is controlled by the processor 4001 to execute. The processor 4001 is used to execute computer programs stored in the memory 4003 to implement the steps shown in the foregoing method embodiments.
The embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, can implement the steps of the foregoing method embodiments and corresponding content.
Embodiments of the present application further provide a computer program product, which includes a computer program, and when the computer program is executed by a processor, the steps and corresponding contents of the foregoing method embodiments may be implemented.
The terms "first," "second," "third," "fourth," "1," "2," and the like in the description and claims of this application and in the preceding drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used are interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in other sequences than described or illustrated herein.
It should be understood that, although each operation step is indicated by an arrow in the flowchart of the embodiment of the present application, the implementation order of the steps is not limited to the order indicated by the arrow. In some implementation scenarios of the embodiments of the present application, the implementation steps in the flowcharts may be performed in other sequences as needed, unless explicitly stated otherwise herein. In addition, some or all of the steps in each flowchart may include multiple sub-steps or multiple stages based on an actual implementation scenario. Some or all of these sub-steps or stages may be performed at the same time, or each of these sub-steps or stages may be performed at different times, respectively. Under the scenario that the execution time is different, the execution sequence of the sub-steps or phases may be flexibly configured according to the requirement, which is not limited in the embodiment of the present application.
The foregoing is only an optional implementation manner of a part of implementation scenarios in this application, and it should be noted that, for those skilled in the art, other similar implementation means based on the technical idea of this application are also within the protection scope of the embodiments of this application without departing from the technical idea of this application.

Claims (10)

1. A window display method in a three-dimensional virtual scene is characterized by comprising the following steps:
rendering a three-dimensional virtual scene and generating two-dimensional texture information of at least one window to be displayed, wherein each window is used for displaying an operation interface of an application program;
acquiring a starting parameter corresponding to each window to be displayed, generating a rendering result according to the starting parameter corresponding to each window to be displayed, and storing the rendering result in a graphic cache region, wherein the rendering result is an operation interface of a corresponding application program;
and obtaining the rendering result from the graphics cache region, generating the window by combining the two-dimensional texture information, and displaying the window in the three-dimensional virtual scene.
2. The method according to claim 1, characterized in that the startup parameters are pre-stored in a three-dimensional rendering engine, which is used for rendering the three-dimensional virtual scene and generating two-dimensional texture information of at least one window to be displayed;
wherein before the acquiring of the starting parameters corresponding to each window to be displayed, the method further comprises:
creating virtual display components which correspond to the windows to be displayed one by one, wherein the virtual display components are used for generating rendering results according to the starting parameters corresponding to the windows to be displayed and storing the rendering results in a graphic cache region;
the acquiring of the starting parameter corresponding to each window to be displayed includes:
and calling a preset window starting interface through the three-dimensional rendering engine, and transmitting each starting parameter to the corresponding virtual display component.
3. The method of claim 2, wherein the creating of virtual display components in one-to-one correspondence with each window to be displayed further comprises: creating a picture reading component bound to each virtual display component;
the obtaining the rendering result from the graphics cache comprises:
monitoring the rendering progress of the virtual display component through the picture reading component, and when it is determined that the virtual display component has stored a rendering result into the graphics cache region, retrieving, by the picture reading component, the rendering result from the graphics cache region through a callback and sending the rendering result to a native rendering module;
the native rendering module is used for generating the window according to the obtained rendering result and the two-dimensional texture information, and displaying the window in the three-dimensional virtual scene.
4. The method of claim 3, wherein the three-dimensional rendering engine generates two-dimensional texture information of at least one window to be displayed, and then further comprises: and transmitting a handle of the two-dimensional texture information to the native rendering module, so that the native rendering module obtains the two-dimensional texture information according to the handle.
5. The method of claim 3 or 4, wherein said presenting said window in said three-dimensional virtual scene further comprises:
in response to an input event for a target window in the three-dimensional virtual scene, sending, by the three-dimensional rendering engine, the input event to a native event module;
and transmitting the input event to a virtual display component corresponding to the target window through a native event injection mechanism by the native event module, so that the virtual display component updates a rendering result according to the input event.
6. The method of any of claims 1-5, wherein said presenting the window in the three-dimensional virtual scene further comprises:
in response to a request for calling up an input method panel, a predefined input method service instructs the three-dimensional rendering engine to display a pre-created custom keyboard in the three-dimensional virtual scene through a predefined transfer service;
and responding to the operation of the user-defined keyboard, and sending key value information input by the operation to the input method service through the transfer service by the three-dimensional rendering engine.
7. The method of claim 6, wherein responding to the request to invoke the input method panel further comprises:
and establishing the connection between the transfer service and the input method service and the connection between the transfer service and the three-dimensional rendering engine through a cross-process communication technology.
8. A window display device in a three-dimensional virtual scene is characterized by comprising:
the texture generation module is used for rendering a three-dimensional virtual scene and generating two-dimensional texture information of at least one window to be displayed, and each window is used for displaying an operation interface of an application program;
the rendering result generating module is used for acquiring the starting parameters corresponding to each window to be displayed, generating a rendering result according to the starting parameters corresponding to each window to be displayed and storing the rendering result into the graphic cache region, wherein the rendering result is an operation interface of a corresponding application program;
and the window display module is used for obtaining the rendering result from the graphic cache region, generating the window by combining the two-dimensional texture information and displaying the window in the three-dimensional virtual scene.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory, wherein the processor executes the computer program to implement the steps of the method for window presentation in a three-dimensional virtual scene according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when being executed by a processor, implements the steps of the method for displaying windows in a three-dimensional virtual scene according to any one of claims 1 to 7.
CN202211276051.8A 2022-09-22 2022-10-18 Window display method and device, electronic equipment and storage medium Pending CN115546410A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2022111610710 2022-09-22
CN202211161071 2022-09-22

Publications (1)

Publication Number Publication Date
CN115546410A 2022-12-30

Family

ID=84735270

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211276051.8A Pending CN115546410A (en) 2022-09-22 2022-10-18 Window display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115546410A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115686727A (en) * 2023-01-04 2023-02-03 麒麟软件有限公司 Method for realizing synthetic rendering based on WLroots



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination