WO2020238846A1 - Image acquisition method, apparatus, server and storage medium - Google Patents
Image acquisition method, apparatus, server and storage medium
- Publication number
- WO2020238846A1 (PCT/CN2020/092076)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- window
- image
- image data
- user
- binary tree
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4781—Games
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/363—Graphics controllers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/34—Network arrangements or protocols for supporting network services or applications involving the movement of software or configuration parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/488—Data services, e.g. news ticker
- H04N21/4882—Data services, e.g. news ticker for displaying messages, e.g. warnings, reminders
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/35—Details of game servers
- A63F13/355—Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/20—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
- A63F2300/209—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform characterized by low level software layer, relating to hardware management, e.g. Operating System, Application Programming Interface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/452—Remote windowing, e.g. X-Window System, desktop virtualisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/455—Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/06—Use of more than one graphics processor to process data before displaying to one or more screens
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/02—Networking aspects
- G09G2370/022—Centralised management of display operation, e.g. in a server instead of locally
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/02—Networking aspects
- G09G2370/027—Arrangements and methods specific for the display of internet documents
Definitions
- This application relates to the field of Internet technology, and in particular to an image acquisition method, device, server and storage medium.
- Cloud gaming is an online gaming technology in which the game runs on a cloud server. The cloud server renders the game scene into a game screen and transmits it to the user terminal in real time, so that the user of the terminal can input corresponding operations on the game screen. Cloud gaming thus allows high-quality games to run on user terminals with relatively limited graphics processing and data computing capabilities, and has become a research hotspot in the gaming field in recent years.
- One of the key issues involved in the research on cloud games is how the cloud server obtains the game screen of the running game.
- The current common approach to this problem is virtual-machine-based screen capture: a virtual machine is deployed for each user terminal, the virtual machine renders the game scene to produce the game screen, and the cloud server captures the virtual machine's desktop directly to obtain the game screen.
- However, this approach requires deploying a large number of virtual machines, which occupies a large amount of server resources and results in a waste of those resources.
- The embodiments of this application provide an image acquisition method, device, server, and storage medium. The server intercepts window image data generated while a target application is running and synthesizes that window image data into the user interface image to be displayed for the target application. Compared with obtaining the user interface image through a virtual machine as in the prior art, this saves the resource cost of deploying virtual machines.
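The intercept-synthesize-notify flow summarized above can be sketched as a minimal pipeline. All names below (`WindowImage`, `acquire_user_interface_image`, the stub callables) are illustrative stand-ins, not functions from the patent; they only show how the three stages connect.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class WindowImage:
    level: int          # window level identifier (higher = drawn on top)
    pixels: list        # placeholder for raw image data

def acquire_user_interface_image(
    intercept: Callable[[], List[WindowImage]],  # data interception module
    compose: Callable[[List[WindowImage]], list],  # picture synthesizer
    send: Callable[[str, list], None],           # transport to the user terminal
    user_id: str,
) -> None:
    """Sketch of the claimed flow: intercept -> synthesize -> notify."""
    window_images = intercept()        # window image data, one per window
    ui_image = compose(window_images)  # user interface image to be displayed
    send(user_id, ui_image)            # notification message to the terminal

# Minimal demonstration with stub callables.
captured = [WindowImage(0, ["home"]), WindowImage(1, ["store"])]
sent = {}
acquire_user_interface_image(
    intercept=lambda: captured,
    compose=lambda imgs: [p for w in sorted(imgs, key=lambda w: w.level)
                          for p in w.pixels],
    send=lambda uid, img: sent.update({uid: img}),
    user_id="user-A",
)
```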
- An embodiment of the present application provides an image acquisition method, which is applied to a server and includes:
- sending a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on the user interface.
- An embodiment of the present application provides an image acquisition device, which is configured in a server and includes:
- an obtaining unit, configured to obtain the target application process corresponding to the user identifier from the application process set;
- an interception unit, configured to call the data interception module to intercept the multiple currently generated window image data when it is detected that the image rendering function in the target application process is called;
- a first processing unit, configured to perform image synthesis processing on the multiple window image data to obtain a user interface image to be displayed;
- a sending unit, configured to send a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on the user interface.
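The four units above can be pictured as one structure whose methods mirror them. This is an illustrative sketch only; the class name, constructor parameters, and stub callables are assumptions for demonstration, not the device's actual implementation.

```python
class ImageAcquisitionDevice:
    """Illustrative structure mirroring the four units described above."""

    def __init__(self, process_set, interceptor, synthesizer, transport):
        self.process_set = process_set  # user identifier -> application process
        self.interceptor = interceptor  # data interception module
        self.synthesizer = synthesizer  # image synthesis processing
        self.transport = transport      # channel to the user terminal

    # obtaining unit
    def get_target_process(self, user_id):
        return self.process_set[user_id]

    # interception unit: invoked when the image rendering function is called
    def on_render_called(self, process):
        return self.interceptor(process)  # multiple window image data

    # first processing unit
    def synthesize(self, window_images):
        return self.synthesizer(window_images)

    # sending unit
    def notify(self, user_id, ui_image):
        self.transport(user_id, {"type": "ui_image", "image": ui_image})

# Demonstration with stub components.
messages = []
device = ImageAcquisitionDevice(
    process_set={"user-A": "proc-A"},
    interceptor=lambda p: [[1], [2]],                       # two window images
    synthesizer=lambda imgs: [x for img in imgs for x in img],
    transport=lambda uid, msg: messages.append((uid, msg)),
)
proc = device.get_target_process("user-A")
imgs = device.on_render_called(proc)
device.notify("user-A", device.synthesize(imgs))
```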
- An embodiment of the present application provides a server, which includes:
- a processor and a computer storage medium storing one or more instructions, the one or more instructions being suitable for being loaded by the processor to execute the image acquisition method described above.
- An embodiment of the present application provides a computer storage medium storing one or more instructions, the one or more instructions being suitable for being loaded by a processor to execute the image acquisition method described above.
- FIG. 1 is an example architecture diagram of image acquisition;
- FIG. 2 is an architecture diagram of image acquisition provided by an embodiment of the present application;
- FIG. 3a is a desktop view of a cloud server provided by an embodiment of the present application;
- FIG. 3b is a schematic diagram of a user interface image provided by an embodiment of the present application;
- FIG. 4 is a schematic flowchart of an image acquisition method provided by an embodiment of the present application;
- FIG. 5a is a schematic diagram of obtaining window image data according to an embodiment of the present application;
- FIG. 5b is a schematic diagram of window image data provided by an embodiment of the present application;
- FIG. 5c is a schematic diagram of another window image data provided by an embodiment of the present application;
- FIG. 5d is a schematic diagram of another window image data provided by an embodiment of the present application;
- FIG. 6a is a schematic flowchart of another image acquisition method provided by an embodiment of the present application;
- FIG. 6b is a schematic flowchart of another image acquisition method provided by an embodiment of the present application;
- FIG. 6c is a schematic flowchart of another image acquisition method provided by an embodiment of the present application;
- FIG. 7a is a schematic diagram of an image polytree provided by an embodiment of the present application;
- FIG. 7b is a schematic diagram of an image binary tree provided by an embodiment of the present application;
- FIG. 8 is a schematic diagram of a data interception module provided by an embodiment of the present application;
- FIG. 9 is a schematic diagram of image synthesis processing provided by an embodiment of the present application;
- FIG. 10a is a desktop view of an application server provided by an embodiment of the present application;
- FIG. 10b is a schematic diagram of a user interface image provided by an embodiment of the present application;
- FIG. 11 is a schematic structural diagram of an image acquisition device provided by an embodiment of the present application;
- FIG. 12 is a schematic structural diagram of a server provided by an embodiment of the present application.
- An architecture diagram of image acquisition may be as shown in FIG. 1, in which:
- 101 represents an application server;
- 101 can run multiple application processes;
- 102 represents the target application process corresponding to a user identifier among the multiple application processes;
- 103 represents a user terminal that has logged in to the target application with the user identifier;
- 104 denotes a virtual machine corresponding to the user terminal 103.
- The application server provides service support for the operation of the target application; that is, the application server executes a collection of application processes that provide service support for realizing the application functions of the target application. The target application may be any application, such as a game application, an instant chat application, or a shopping application.
- The application server provides this service support based on the service program related to the target application; specifically, the application server supports the target application in realizing its application functions by executing the service program. In the computer field, the activity of the application server running the service program is called an application process. If multiple user terminals run the target application at the same time, then to ensure that the operation of the target application in each user terminal does not interfere with the others, the application server creates an application process for each user to support the operation of their respective target applications. That is, the application server executes a set of application processes that provide service support for realizing the application functions of the target application; the set includes multiple application processes, and each application process corresponds to a user identifier.
- The target application process described in the embodiments of the present application may refer to the application process corresponding to any user identifier.
- For example, if the target application is a game application, then after the application server detects operation instructions from user A and user B, it creates application process A and application process B for user A and user B respectively, to support the game experience of the players on the two user terminals.
- When the application server 101 receives an operation instruction for the target application sent by the user terminal 103, the application server 101 notifies the virtual machine 104 corresponding to the user identifier of the user terminal 103 to run the target application process 102, while the other virtual machines run application process 1 and application process 2 respectively.
- The application server 101 obtains the user interface image corresponding to the user identifier by running the target application on the virtual machine 104.
- The method of obtaining the user interface image corresponding to the user identifier through the virtual machine may be as follows: the virtual machine 104 calls the image rendering function of the target application process 102 to perform rendering and obtain the rendered window image data; the virtual machine 104 then synthesizes the window image data together with other parameters to obtain the image to be displayed, which is displayed in the user interface of the virtual machine 104; the application server 101 captures the current user interface of the virtual machine 104, uses the captured image as the user interface image corresponding to the user identifier, and sends that image to the user terminal 103. In this way, the user identified by the user identifier can view the user interface image corresponding to the target application process 102 through the user terminal 103.
- Alternatively, the image acquisition architecture diagram may be as shown in FIG. 2, in which:
- 201 denotes an application server; the application server 201 executes the set of application processes that provide service support for the application functions of the target application.
- 202 represents the target application process corresponding to the user identifier in the application process set; 203 represents the user terminal that logs in to the target application with the user identifier, that is, the user terminal corresponding to the user identifier.
- Each application process in the application process set is also configured with a data interception module; for example, a data interception module 2021 is configured in the target application process 202, and corresponding data interception modules are configured in application process 1 and application process 2.
- The application server 201 may call the data interception module 2021 in the target application process 202 to capture window image data. The window image data is obtained when the image rendering function is called for rendering during the execution of the target application process 202; image synthesis is then performed on the window image data to obtain the user interface image to be displayed.
- A picture synthesizer 204 may be configured in the application server 201; the application server 201 may call the data interception module to intercept the window image data and send it to the picture synthesizer 204, which performs image synthesis processing on the window image data to obtain the user interface image to be displayed. After obtaining the user interface image, the application server 201 sends a notification message carrying the user interface image to the user terminal 203, so that the user terminal 203 can display the user interface image on the user interface.
- By contrast, in the virtual-machine-based architecture each application process runs on a corresponding virtual machine instead of directly on the application server, resulting in low operating efficiency of the application processes.
- To solve the above problem, an embodiment of the present application provides an image acquisition method that can be applied to an application server, where the application server is used to execute a collection of application processes that provide service support for realizing the application functions of the target application, such as the application server 101 in FIG. 1 or the application server 201 in FIG. 2.
- With the image acquisition method described in the embodiments of this application, the user interface image corresponding to a given user identifier can be obtained, carried in a notification message, and sent to the user terminal corresponding to that user identifier to instruct the terminal to display the user interface image on the user interface. In this way, the user identified by the user identifier can operate the target application through the user interface image.
- Compared with the virtual machine approach, the embodiments of this application save the resource overhead of deploying virtual machines in the application server.
- The image acquisition method described in the embodiments of this application can be applied to the application scenario of cloud gaming.
- A so-called cloud game means that the games corresponding to multiple user identifiers run on a cloud server; the cloud server renders the game scene of each game into game frames and transmits them in real time to the user terminal corresponding to each user identifier, so that the user identified by each user identifier can input corresponding operations on the game screen.
- FIG. 3a is a desktop view of a cloud server provided by an embodiment of this application.
- The cloud server is the application server 101 in FIG. 1 or the application server 201 in FIG. 2, and 300 in FIG. 3a denotes the desktop of the cloud server.
- Multiple application processes supporting game running are running on the cloud server at the same time.
- The cloud server can acquire the game screens corresponding to the multiple application processes.
- The desktop view can display the game screens corresponding to each application process at the same time, such as 301, 302, and 303.
- The game screen corresponding to each application process is the game screen corresponding to each user identifier:
- 301 indicates the game screen corresponding to user identifier A;
- 302 indicates the game screen corresponding to user identifier B;
- 303 indicates the game screen corresponding to user identifier C.
- The game screen corresponding to each user identifier can be transmitted to the corresponding user terminal for display.
- The game screen seen in the user interface of a user terminal is the game screen corresponding to that user identifier in the cloud server; game screens corresponding to other user identifiers are not displayed. That is, the games corresponding to different users do not affect each other.
- Consider the user interface image of the user terminal corresponding to user identifier A among the multiple user identifiers; the user terminal is, for example, the user terminal 103 in FIG. 1 or the user terminal 203 in FIG. 2. It can be seen that only the game screen 301 corresponding to user identifier A is displayed in the user interface of that terminal.
- An embodiment of the present application provides an image acquisition method, whose schematic flowchart is shown in FIG. 4.
- The image acquisition method described in FIG. 4 may be executed by a server, and specifically by a processor of the server.
- When applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2, the application server is used to execute a collection of application processes that provide service support for realizing the application functions of the target application.
- The image acquisition method described in FIG. 4 may include the following steps:
- The user identifier is used to indicate the identity of the user, and may include any one or more of the user's ID card number, the user's mobile phone number, and the user's login account.
- The acquisition of the target application process corresponding to the user identifier may be executed by the application server after receiving a user operation request sent by the user terminal.
- The user terminal refers to a terminal device that has logged in based on a user identifier.
- The terminal device may include any device capable of video decompression, such as a mobile phone, a tablet, or a desktop computer; the user operation request may include any one or more of a login operation, a window switching operation, and a logout operation on the target application.
- If the user terminal has not yet logged in to the target application, the user operation request received at this time may refer to a login operation; if the user terminal has logged in to the target application based on the user identifier, the user operation request at this time may refer to a logout operation, a window switching operation on each window included in the target application, or a request to run a certain application function of the target application.
- FIG. 5a is a schematic diagram of obtaining window image data according to an embodiment of the present application.
- The obtaining method may be executed by a server, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2, and includes the following steps:
- S4023: Call the data interception module to obtain the window image data generated by the image rendering function for each window.
- Image rendering refers to the process of converting a three-dimensional light-energy transfer process into a two-dimensional image.
- An image rendering function refers to a function that can implement image rendering, for example:
- OpenGL (Open Graphics Library);
- GDI (Graphics Device Interface);
- DirectX (a set of programming interfaces).
- OpenGL is a low-level 3D graphics library that only provides rendering functionality and can be ported between different platforms.
- DirectX mainly relies on GPU hardware support by design, and is more suitable for image rendering in game applications.
- GDI is designed for general-purpose use and can also be used for image rendering in game applications, but its effect is not as good as that of DirectX.
- The rendering mode used by the image rendering function of the target application process may differ from window to window.
- For example, when rendering the first window of the target application, the rendering mode included in the image rendering function may be GDI; when rendering the second window of the target application, the rendering mode included in the image rendering function may be OpenGL.
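In practice the data interception module would hook native rendering APIs such as GDI, OpenGL, or DirectX calls inside the target process; the following Python sketch only illustrates the wrapping idea, with an illustrative `render_window` function standing in for a real rendering call.

```python
# Frames captured as a side effect of interception.
captured_frames = []

def render_window(window_name):
    """Stand-in for the image rendering function of one window."""
    return f"pixels({window_name})"

def hook(render_fn):
    """Wrap a rendering function so each call's output is also captured."""
    def wrapper(window_name):
        frame = render_fn(window_name)                # original rendering still runs
        captured_frames.append((window_name, frame))  # interception side effect
        return frame
    return wrapper

# Replace the rendering function with its hooked version.
render_window = hook(render_window)

render_window("Home")
render_window("Store")
render_window("Video")
```

The key property, mirrored here, is that the application's rendering path is unchanged: each window still receives its rendered frame, while the interception module quietly records a copy of every frame produced.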
- For example, the game application may include three windows: a "Home" window, a "Store" window, and a "Video" window. Each window corresponds to one piece of window image data. Specifically, the "Home" window image data is shown in FIG. 5b, the "Store" window image data is shown in FIG. 5c, and the "Video" window image data is shown in FIG. 5d.
- S403: Perform image synthesis processing on the multiple window image data to obtain the user interface image to be displayed.
- The image synthesis processing may refer to superimposing each window image data in turn according to the synthesis order of the window image data.
- The synthesis order refers to the order in which each window image data is processed during image synthesis: window image data earlier in the synthesis order is processed earlier.
- the synthesis sequence may be determined according to the window level identifier corresponding to each window image data.
- the window level identifier is used to indicate the level of the window. The higher the level indicated by the window level identifier, the lower the synthesis order of the window image data.
- the window level identifiers of the three window image data are "Video", "Store", and "Homepage" in descending order, and according to the above window level identifiers,
- the synthesis order of the three window image data in Figures 5b-5d can be "Homepage", "Store", and "Video"; if the three window image data are synthesized according to this synthesis order,
- the obtained user interface image to be displayed can be as shown in Figure 3b.
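The superimposition described above can be sketched as painting each window's pixels onto a shared canvas in synthesis order, with later windows covering earlier ones where they overlap. The dictionary-based pixel model and function name below are illustrative assumptions, not the patent's implementation:

```python
# Minimal sketch of image synthesis by superimposition (hypothetical data model):
# each "window image" maps pixel coordinates to a color; later layers in the
# synthesis order overwrite earlier ones where they overlap.

def synthesize(window_images):
    """window_images: list of (name, pixels) in synthesis order (earliest first)."""
    canvas = {}
    for _name, pixels in window_images:
        canvas.update(pixels)  # later windows cover earlier ones where they overlap
    return canvas

# Toy pixels standing in for the "Homepage", "Store", and "Video" window image data.
homepage = {(0, 0): "H", (0, 1): "H"}
store = {(0, 1): "S", (0, 2): "S"}
video = {(0, 2): "V"}

ui_image = synthesize([("Homepage", homepage), ("Store", store), ("Video", video)])
# pixel (0, 1) is covered by "Store", and (0, 2) by "Video"
```

Composing in the order "Homepage"-"Store"-"Video" leaves the later windows visible wherever the layers overlap, matching the cover-up behavior the synthesis order is meant to produce.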
- the application server may call a screen synthesizer to synthesize the image data of each window.
- S404 Send the notification message carrying the user interface image to the user terminal corresponding to the user identification, so that the user terminal displays the user interface image on the user interface.
- the notification message is used to instruct the user terminal to display a user interface image on the user interface.
- a schematic diagram of the user terminal displaying the user interface image on the user interface may be shown in FIG. 3b.
- the application server in the embodiment of the present application intercepts the window image data obtained by calling the rendering function during execution of the target application, and then performs image synthesis on the window image data
- to obtain the user interface image, which can save the resource overhead of the application server.
- FIG. 6a is a schematic flowchart of another image acquisition method provided by an embodiment of the present application.
- the image acquisition method described in FIG. 6a may be executed by a server, and specifically may be executed by a processor of the server. For example, it can be applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2, and the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application.
- the image acquisition method described in FIG. 6a may include the following steps:
- S601 Obtain a target application process corresponding to a user identifier from the set of application processes.
- This step is the same as the above step 401, and will not be repeated here.
- Each window image data is obtained by executing one or more image rendering functions of the target application.
- the rendering mode included in each image rendering function can be any of OpenGL, GDI, and DirectX.
- S603 According to the rendering mode, determine a data interception sub-module corresponding to the rendering mode in the data interception module.
- the data interception module may include three interception sub-modules, and the three interception sub-modules respectively correspond to the above three rendering modes.
- the data interception module includes three interception sub-modules, which may be an OpenGL interception sub-module, a GDI interception sub-module, and a DirectX interception sub-module.
- the correspondence between each interception submodule and the rendering mode may be: the rendering mode corresponding to the OpenGL interception submodule is OpenGL, the rendering mode corresponding to the GDI interception submodule is GDI, and the rendering mode corresponding to the DirectX interception submodule is DirectX. It should be understood that calling the interception submodule corresponding to the rendering mode to obtain window image data can ensure the accuracy of the intercepted window image data.
- the schematic diagram of the data interception module provided by the embodiment of the present application may be as shown in FIG. 8.
- the data interception module shown in FIG. 8 may include a GDI interception sub-module, a DirectX interception sub-module, and an OpenGL interception sub-module.
- S604 Call the data interception sub-module to intercept multiple window image data obtained when the image rendering function is executed.
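The selection of a data interception sub-module by rendering mode can be sketched as a simple dispatch table; the class names, method names, and returned strings below are hypothetical, since the patent does not specify the sub-modules' interfaces:

```python
# Sketch of dispatching to a data interception sub-module by rendering mode
# (hypothetical sub-module classes and return values, for illustration only).

class GdiInterceptor:
    def intercept(self, window):
        return f"GDI image data of {window}"

class DirectXInterceptor:
    def intercept(self, window):
        return f"DirectX image data of {window}"

class OpenGLInterceptor:
    def intercept(self, window):
        return f"OpenGL image data of {window}"

# The data interception module maps each rendering mode to its sub-module, so
# the sub-module matching the detected rendering mode is always the one called,
# which keeps the intercepted window image data accurate.
DATA_INTERCEPTION_MODULE = {
    "GDI": GdiInterceptor(),
    "DirectX": DirectXInterceptor(),
    "OpenGL": OpenGLInterceptor(),
}

def intercept_window_image(rendering_mode, window):
    return DATA_INTERCEPTION_MODULE[rendering_mode].intercept(window)

data = intercept_window_image("OpenGL", "Homepage")
```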
- While performing steps S602-S604, the following step S605 may also be performed:
- S605 Obtain screen display parameters set for the target application process, and determine the window level identifier of each window image data according to the screen display parameters.
- the screen display parameters may include any one or more of whether the window corresponding to each window image data is minimized or maximized, and the arrangement relationship between the windows; for example, the arrangement relationship can be that window A is placed before window B, window C is placed after window B, and so on.
- the screen display parameters may be stored in a sandbox.
- the sandbox described in this embodiment of the application is a mechanism that can isolate multiple application processes running on the application server from each other.
- the screen display parameters set for the target application process stored in the sandbox may be parameters set by default for the target application process by the application server.
- the parameters set by default for the target application process that supports the game function of the game application may include: the window corresponding to the "Homepage" window image data is placed at the front, and the window corresponding to the "Store" window image data is placed behind the window corresponding to the "Homepage" window image data,
- and the window corresponding to the "Video" window image data is placed behind the window corresponding to the "Store" window image data; at this time, the screen display parameters can be: the "Homepage" window image data is displayed at the top, the "Store" window image data is placed after the "Homepage" window image data, and the "Video" window image data is placed after the "Store" window image data.
- the screen display parameters set for the target application process stored in the sandbox may be determined according to the user operation request sent by the user terminal and the default parameters set by the application server for the target application process.
- the default parameters set for the target application process that provides service support for realizing the game functions of the game application may include: the window corresponding to the "Homepage" window image data is displayed first, the window corresponding to the "Store" window image data is placed behind the window corresponding to the "Homepage" window image data, and the window corresponding to the "Video" window image data is placed behind the window corresponding to the "Store" window image data; it is then detected that the user operation request includes switching from the window corresponding to the "Homepage" window image data to the window corresponding to the "Video" window image data.
- in this case, the screen display parameters determined according to the user operation request and the parameters set by default by the application server may include: the window corresponding to the "Video" window image data is displayed at the top, the window corresponding to the "Store" window image data is placed behind the window corresponding to the "Homepage" window image data, and the window corresponding to the "Homepage" window image data is placed behind the window corresponding to the "Video" window image data.
- the window level identifier of each window image data can be determined according to the screen display parameters.
- the window level identifier is used to uniquely indicate the corresponding window image data, and the window level identifier indicates the hierarchical relationship between window image data.
- the window level identifier can be represented by one or more numbers, such as 1, 2, 3, or (1, 2), (2, 1), (3, 1), etc.; alternatively, the window level identifier can be represented in other forms.
- when numbers are used to represent window level identifiers, if two window level identifiers contain different counts of numbers, the identifier containing more numbers is the higher one; for example, the window level identifier represented by (1, 2) is higher than the window level identifier represented by 1.
- when two window level identifiers each contain N numbers, where N is any positive integer greater than 1, if their first N-1 numbers are the same and their N-th numbers differ, the identifier whose numbers have the smaller sum is the higher one; for example, the window level identifier represented by (1, 2) is higher than the window level identifier represented by (1, 5).
- the window corresponding to window level identifier (1, n) (where n is any positive integer greater than or equal to 1) is a child window of the window corresponding to window level identifier (1); the window corresponding to window level identifier (1, n, m)
- is a child window of the window corresponding to window level identifier (1, n). Based on the foregoing description, it can be seen that the window level identifier corresponding to a child window is higher than the window level identifier corresponding to its parent window.
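The comparison rules above can be sketched as a small helper; the tuple representation of an identifier and the function name are illustrative assumptions:

```python
def is_higher(a, b):
    """Return True if window level identifier a is higher than identifier b.

    A sketch of the rules described above (not the patent's code):
    - an identifier with more numbers is higher, e.g. (1, 2) is higher than (1,),
      so a child window sits above its parent;
    - with the same count of numbers and the same first N-1 numbers, the
      identifier whose numbers have the smaller sum is higher,
      e.g. (1, 2) is higher than (1, 5).
    """
    if len(a) != len(b):
        return len(a) > len(b)
    if a[:-1] == b[:-1] and a[-1] != b[-1]:
        return sum(a) < sum(b)
    return False

parent = (1,)
child = (1, 2)
result = is_higher(child, parent)  # child windows are higher than their parents
```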
- S606 Perform image synthesis processing on multiple window image data according to the window level identification to obtain a user interface image to be displayed.
- the image synthesis processing is performed on the image data of the multiple windows according to the order of the hierarchical relationship from high to low.
- S607 Send the notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on the user interface.
- This step is the same as the above step S404, and will not be repeated here.
- FIG. 6b is a schematic flowchart of another image acquisition method provided by an embodiment of the present application.
- the image acquisition method described in FIG. 6b may be executed by a server, and specifically may be executed by a processor of the server. For example, it can be applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2.
- the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application, as shown in FIG. 6b.
- the image acquisition method described may include the following steps:
- S701 Obtain a target application process corresponding to the user identifier from the application process set.
- This step is the same as the above step 401, and will not be repeated here.
- S702 Obtain screen display parameters set for the target application process, and determine the window level identifier of each window image data according to the screen display parameters.
- This step is the same as the above step 605 and will not be repeated here.
- S703 Generate an image binary tree according to the window level identifier of the image data of each window.
- an image binary tree is generated according to the window level identifier of each window image data.
- the image binary tree may include multiple binary tree nodes, and each binary tree node stores a window level identifier.
- the generating an image binary tree according to the window level identifiers of the respective window image data may include:
- S7031 Construct an image polytree according to the window level identifiers of the image data of each window.
- the image polytree includes N levels, and each level includes M nodes, where N and M are both positive integers greater than or equal to 1; two adjacent levels in the image polytree include parent nodes and child nodes
- in a parent-child relationship, and nodes in the same level of the image polytree that share the same parent node are sibling nodes;
- S7032 Convert the image polytree into an image binary tree.
- the first node in each level of the image polytree is converted into the left node of the image binary tree, and other nodes that are siblings with the first node are converted into the right node of the image binary tree .
- an image multi-tree constructed according to the window level identification of each window image data is as shown in Figure 7a.
- the image polytree includes 3 levels, shown as 701-703.
- the numbers in the nodes of each level represent the window level identifiers of the window image data corresponding to those nodes, as can be seen from the figure.
- a node is included in the 701 level.
- the nodes included in the 701 level may be referred to as root nodes.
- the root node in this embodiment of the application is the desktop image data of the user terminal.
- the desktop image data of the user terminal may be sent to the application server by the user terminal after the user terminal receives a selection operation on the desktop image data input by the user; there are three nodes in the 702 level, and three nodes in the 703 level.
- the windows corresponding to the window level identifiers (1, 1) and (1, 2) in the 703 level are child windows of the window corresponding to window level identifier 1, so the nodes represented by the window level identifiers (1, 1) and (1, 2)
- both have the node represented by window level identifier 1 as their parent node, and the nodes represented by the window level identifiers (1, 1) and (1, 2) are therefore sibling nodes.
- the nodes represented by the window level identifiers 1, 2, and 3 are sibling nodes of one another.
- Converting the image polytree shown in Figure 7a into an image binary tree refers to: converting the first node in each of the levels 701-703 of the image polytree into a left node of the image binary tree. For example, the first node
- in the 701 level is the node represented by window level identifier 0, and this node is taken as a left node of the image binary tree; since the 701 level includes no other nodes, the conversion continues to the 702 level. In the 702 level, the first node is the node represented by window level identifier 1, and this node is likewise taken as a left node of the image binary tree.
- the 702 level also includes two sibling nodes of the node represented by window level identifier 1, namely the node represented by window level identifier 2 and the node represented by window level identifier 3; these two nodes are taken in turn as right nodes of the image binary tree.
- taking the node represented by window level identifier 2 and the node represented by window level identifier 3 in turn as right nodes of the image binary tree can be understood as: the node represented by window level identifier 2 serves as the right node of the node represented by window level identifier 1,
- and the node represented by window level identifier 3 serves as the right node of the node represented by window level identifier 2.
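The conversion just described is the classic left-child/right-sibling encoding: a node's first child becomes its left node, and each remaining sibling becomes the right node of the sibling before it. A minimal sketch, with hypothetical node and function names, reproduces the structure of Figure 7b from the polytree of Figure 7a:

```python
# Sketch of converting the image polytree into an image binary tree using the
# left-child/right-sibling rule described above (hypothetical node classes).

class BinNode:
    def __init__(self, level_id):
        self.level_id = level_id
        self.left = None   # first child in the polytree
        self.right = None  # next sibling in the polytree

def to_binary(level_id, children):
    """children: list of (level_id, grandchildren) pairs in sibling order."""
    node = BinNode(level_id)
    prev = None
    for child_id, grandchildren in children:
        child = to_binary(child_id, grandchildren)
        if prev is None:
            node.left = child   # the first sibling becomes the left node
        else:
            prev.right = child  # later siblings chain off the right
        prev = child
    return node

# Polytree of Figure 7a: root 0 with children 1, 2, 3;
# node 1 has children (1,1), (1,2); node 3 has child (3,1).
root = to_binary("0", [
    ("1", [("1,1", []), ("1,2", [])]),
    ("2", []),
    ("3", [("3,1", [])]),
])
# root.left is node "1"; node "1".right is node "2"; node "2".right is node "3"
```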
- S704 Traverse the image binary tree according to the preset first traversal order, to obtain the synthesis order of the image data of each window.
- the preset first (i.e., binary tree image synthesis) traversal order may refer to preorder traversal, and preorder traversal refers to the order: root node - left node - right node.
- the result of the preorder traversal in the image binary tree shown in Fig. 7b is: 0-1-(1,1)-(1,2)-2-3-(3,1).
- the image binary tree is traversed according to the preset binary tree image synthesis traversal sequence, and the synthesis order of each window image data among the window image data is obtained as 0-1-(1,1)-(1,2)-2-3-(3,1).
- S705 Traverse the image binary tree according to the preset second traversal order to obtain the image capturing order of each window image data, and call the data interception module to intercept and obtain multiple window image data according to the image capturing order.
- the preset second (i.e., binary tree image acquisition) traversal order may refer to post-order traversal, and the traversal order of post-order traversal is: left node - right node - root node.
- the result of the post-order traversal in the image binary tree shown in Figure 7b is: (1,1)-(1,2)-(3,1)-3-2-1-0.
- the image binary tree is traversed according to the preset binary tree image acquisition traversal order, and the image capturing order of each window image data among the window image data is obtained as (1,1)-(1,2)-(3,1)-3-2-1-0.
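A preorder walk over the converted binary tree can be sketched as follows; the tuple-based node encoding is an illustrative assumption, and the tree below is the image binary tree of Figure 7b:

```python
# Preorder traversal (root node - left node - right node) over a binary tree
# whose nodes are (level_id, left, right) tuples; None marks a missing child.

def preorder(node):
    if node is None:
        return []
    level_id, left, right = node
    return [level_id] + preorder(left) + preorder(right)

# Image binary tree of Figure 7b: 0's left node is 1; 1's left node is (1,1),
# whose right node is (1,2); 1's right node is 2, whose right node is 3;
# 3's left node is (3,1).
tree = ("0",
        ("1",
         ("1,1", None, ("1,2", None, None)),
         ("2", None,
          ("3", ("3,1", None, None), None))),
        None)

synthesis_order = preorder(tree)
# yields 0-1-(1,1)-(1,2)-2-3-(3,1), the synthesis order quoted above
```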
- the window image data may be subjected to image synthesis processing to obtain the user interface image to be displayed.
- S706 Perform image synthesis processing on each window image data according to the synthesis sequence to obtain a user interface image to be displayed.
- step 706 may be: allocate a sub-canvas for each window image data, place the corresponding window image data on the corresponding sub-canvas, and arrange each sub-canvas according to the above-mentioned synthesis sequence Combine to get the user interface image to be displayed.
- the combination is performed in accordance with the above-mentioned synthesis sequence so that window image data with a higher window level identifier is combined later;
- as a result, in the obtained user interface image, window image data with a higher window level identifier covers window image data with a lower window level identifier.
- S707 Send the notification message carrying the user interface image to the user terminal corresponding to the user identification.
- This step is the same as the above step 404, and will not be repeated here.
- FIG. 6c is a schematic flowchart of another image acquisition method provided by an embodiment of the present application.
- the image acquisition method may be executed by a server, and specifically may be executed by a processor of the server.
- it can be applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2.
- In Fig. 6c, on the basis of the embodiment described in Fig. 6b, the following steps may also be performed after step S702:
- S708 Generate a message response binary tree according to the window level identifier of each window image data.
- the message response binary tree includes multiple message nodes, and each message node records window position information and window size information corresponding to corresponding window image data.
- the message response binary tree can be generated using the same method as step S703.
- the message response binary tree and the image binary tree have the same structure.
- the message binary tree generated according to the window level identifier of each window image data may also be as shown in FIG. 7b.
- Each node of the image binary tree stores the window level identifier of the window image data. Unlike the image binary tree, each node in the message response binary tree not only stores the window level identifier of the window image data, but also stores the window position information and window size information corresponding to each window image data.
- the window position information may include the coordinates of the window, and the coordinates of the window may be expressed in the form of pixel coordinates, such as (3 pixels, 6 pixels), or in the form of world coordinates, such as (10, 15);
- the window size information may include information such as the width and height of the window.
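A message node of the message response binary tree can be sketched as a record holding the window level identifier together with window position information and window size information; all field names below are hypothetical, and coordinates are assumed to be pixel units:

```python
# Sketch of a message node in the message response binary tree (hypothetical
# field names): alongside the window level identifier it records the window
# position (top-left corner) and the window size (width and height).

from dataclasses import dataclass
from typing import Optional

@dataclass
class MessageNode:
    level_id: str                        # window level identifier, e.g. "1,2"
    x: int                               # window position information
    y: int
    width: int                           # window size information
    height: int
    left: Optional["MessageNode"] = None
    right: Optional["MessageNode"] = None

    def contains(self, px, py):
        """True if point (px, py) falls inside this node's window rectangle."""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

node = MessageNode("1", x=0, y=0, width=100, height=50)
```

The `contains` check is the "preset location condition" used later when matching a user operation position against a node's recorded rectangle.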
- S709 Determine display position information of each window image data based on the message response binary tree.
- the display position information may include any one or two of window position information and window size information of the window image data.
- S710 Perform synthesis processing on the image data of each window according to the display position information and the synthesis sequence, to obtain a user interface image to be displayed.
- the embodiment of the present application provides a schematic diagram of image synthesis processing as shown in FIG. 9.
- the data acquisition module is used to acquire the image data of each window in the target application process.
- image synthesis processing is then performed on the image data of each window according to the image binary tree and the message response binary tree.
- an implementation manner of determining the display position information of each window image data based on the message response binary tree may include:
- the application server can obtain the window level identifier of the first window image data according to the image binary tree, then search the message response binary tree for the window position information and window size information corresponding to the window level identifier of the first window image data, and determine the found window position information and window size information as the display position information of the first window image data.
- in addition to being applied in the image synthesis process, the message response binary tree can also be used to find the target window image data corresponding to a user operation message, so that the application server can process the target window image data in response to the user operation message.
- the user operation message corresponds to an operation, related to the user interface image, that the user inputs through the user terminal.
- Figure 6c also includes the following steps:
- S711 Receive a user operation message sent by the user terminal, and obtain operation location information from the user operation message.
- the operation position information included in the user operation message refers to the position information of the location where the user input the operation; for example, when the user clicks, double-clicks, or long-presses a certain position in the user interface image, the position information of that position is determined as the operation position information.
- S712 Traverse the message response binary tree according to the preset third traversal sequence, and determine the message node whose operation location information meets the preset location condition as the target message node.
- the target message node is the first message node found in the message response binary tree that meets a preset location condition, where the operation location information meeting the preset location condition means that the operation location information is in the Within the range defined by the window position information and window size information recorded by the target message node;
- S713 Use window image data corresponding to the target message node as target window image data, and process the window corresponding to the target window image data in response to the user operation message.
- the application server runs application processes corresponding to multiple user identifiers, and the application server displays user interface images corresponding to multiple user identifiers.
- the user interface images corresponding to different user identifiers may overlap, as shown in Figure 10a; in Figure 10a,
- A and B represent user interface images corresponding to different user identifiers. There may also be overlaps between the window image data within the user interface image corresponding to the same user identifier, as shown in Figure 10b; in Figure 10b, A and B represent two window image data in the user interface image corresponding to the same user identifier.
- Different user identifiers correspond to different application processes, and the screen display parameters corresponding to different application processes differ. Because the image binary tree and the message response binary tree are generated according to the window level identifiers determined by the screen display parameters, the image binary trees and message response binary trees of different application processes also differ; it can thus be understood that the message response binary trees corresponding to different user identifiers are different.
- the application server needs to know which user interface image the user operation message input by the user is for.
- the binary tree of message responses corresponding to different user identities may be used to determine which user interface image the currently received user operation message is for.
- the application server needs to know which window image data in the user interface image the user operation message is for.
- the message response binary tree may be traversed according to a preset third (i.e., binary tree window search) traversal sequence to determine the target message node, and then the target window image data corresponding to the target message node is determined.
- the preset third traversal order may refer to in-order traversal, and in-order traversal refers to traversing the binary tree in the order: left node - root node - right node. For example, supposing that the message response binary tree is as shown in Figure 7b, the in-order traversal result is: (1,1)-(1,2)-1-2-(3,1)-3-0.
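The hit test described above, which walks the message response binary tree in order (left node - root node - right node) and returns the first node whose recorded rectangle contains the operation position, can be sketched as follows; the tuple-based nodes and the sample coordinates are illustrative assumptions:

```python
# Sketch of locating the target message node: traverse the message response
# binary tree in in-order and return the first node whose recorded rectangle
# contains the operation position. Nodes are (level_id, rect, left, right)
# tuples with rect = (x, y, width, height).

def inorder(node):
    if node is None:
        return
    _id, _rect, left, right = node
    yield from inorder(left)    # left node first
    yield node                  # then the root node
    yield from inorder(right)   # then the right node

def find_target(tree, px, py):
    for level_id, (x, y, w, h), _l, _r in inorder(tree):
        if x <= px < x + w and y <= py < y + h:
            return level_id  # first match wins: child windows are visited first
    return None

# Two windows: parent "1" covers (0, 0)-(100, 100); its child "1,1" covers
# (10, 10)-(50, 50) and is visited before "1" by the in-order walk.
tree = ("1", (0, 0, 100, 100),
        ("1,1", (10, 10, 40, 40), None, None),
        None)

target = find_target(tree, 20, 20)  # inside both windows; the child is found first
```

Because child windows sit above their parents, visiting them first means the topmost window under the click is the one that receives the user operation message.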
- an embodiment of the present application also discloses an image acquisition device, which can be applied to a server, for example, the application server 101 in FIG. 1 or the application server 201 in FIG. 2,
- the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application, and the image acquisition device can execute the methods shown in FIGS. 4 and 6a to 6c.
- the image acquisition device 1100 may run the following units:
- the obtaining unit 1101 is configured to obtain the target application process corresponding to the user identifier from the application process set;
- the interception unit 1102 is configured to, when it is detected that the image rendering function in the target application process acquired by the acquisition unit 1101 is called, call the data interception module to intercept the currently generated multiple window image data;
- the first processing unit 1103 is configured to perform image synthesis processing on the multiple window image data intercepted by the intercepting unit 1102 to obtain a user interface image to be displayed;
- the sending unit 1104 is configured to send a notification message carrying the user interface image obtained by the first processing unit 1103 to the user terminal corresponding to the user identification, so that the user terminal displays the user interface image on the user interface .
- the obtaining unit 1101 is further configured to obtain the screen display parameters set for the target application process
- the device 1100 also includes:
- the determining unit 1105 is configured to determine the window level identifier of each window image data according to the screen display parameters acquired by the acquiring unit 1102;
- the first processing unit 1103 is further configured to perform image synthesis processing on the multiple window image data according to the window level identifier determined by the determining unit 1105.
- the apparatus 1100 further includes:
- the first generating unit 1106 is configured to generate an image binary tree according to the window level identifier of each window image data determined by the determining unit 1105;
- the first processing unit 1103 is further configured to traverse the binary tree of images generated by the first generating unit 1106 according to the preset first traversal order to obtain the synthesis order of the image data of each window;
- the multiple window image data are processed for image synthesis.
- the first generating unit 1106 is configured to construct an image multi-tree according to the window level identifiers of the image data of each window determined by the determining unit 1105; and convert the image multi-tree to the image binary tree .
- the apparatus 1100 further includes:
- the first generating unit 1106 is configured to generate an image binary tree according to the window level identifier of each window image data determined by the determining unit 1105;
- the intercepting unit 1102 is configured to traverse the binary tree of images generated by the first generating unit 1106 according to the preset second traversal order to obtain the image capturing order of each window image data; according to the image capturing order, call the The data interception module intercepts and obtains the multiple window image data.
- the interception unit 1102 is configured to determine the rendering mode included in the image rendering function; according to the rendering mode, determine the data interception submodule in the data interception module corresponding to the rendering mode; call The data interception sub-module intercepts multiple window image data obtained when the image rendering function is executed.
- the apparatus 1100 further includes:
- the second generating unit 1107 is configured to generate a message response binary tree according to the window level identifier of each window image data determined by the determining unit 1105;
- the first processing unit 1103 is configured to determine the display position information of each window image data based on the message response binary tree generated by the second generation unit 1107; according to the display position information and the synthesis sequence, perform the Image data is synthesized.
- the message response binary tree includes multiple message nodes
- the obtaining unit 1101 is further configured to receive a user operation message sent by the user terminal, and obtain operation position information from the user operation message;
- the device 1100 also includes:
- the second processing unit 1108 is configured to traverse the message response binary tree generated by the second generating unit 1107 according to the preset third traversal sequence, and determine the message node for which the operation position information obtained by the obtaining unit 1101 meets the preset position condition as the target message node; and to take the window image data corresponding to the target message node as the target window image data, and process the window corresponding to the target window image data in response to the user operation message.
- the units in the image acquisition device shown in FIG. 11 can be separately or entirely combined into one or several other units, or one or more of them can be further split into multiple functionally smaller units, which can achieve the same operations without affecting the realization of the technical effects of the embodiments of the present application.
- the above-mentioned units are divided based on logical functions.
- the function of one unit may also be realized by multiple units, or the functions of multiple units may be realized by one unit.
- the image acquisition device may also include other units. In practical applications, these functions may also be implemented with the assistance of other units, and may be implemented by multiple units in cooperation.
- a general-purpose computing device, such as a computer, including a central processing unit (CPU), a random access memory (RAM), a read-only memory (ROM), and other processing elements and storage elements
- CPU: central processing unit
- RAM: random access memory
- ROM: read-only memory
- A computer program (including program code) capable of executing the steps involved in the corresponding method shown in FIG. 4 or FIGS. 6a-6c may be run on such a computing device to construct the image acquisition device shown in FIG. 11 and to implement the image acquisition method of the embodiments of the present application.
- the computer program can be recorded on, for example, a computer-readable recording medium, and loaded into the aforementioned computing device via the computer-readable recording medium, and run in it.
- an embodiment of the present application further provides a server.
- the server may be the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2, and is used to execute a set of application processes that provide service support for realizing the application functions of the target application.
- the server includes at least a processor 1201, a network interface 1202, and a computer storage medium 1203.
- the network interface 1202 is used to receive or send data when connecting to the network.
- The computer storage medium 1203 may reside in the memory of the server.
- The computer storage medium 1203 is used to store a computer program, the computer program includes program instructions, and the processor 1201 is used to execute the program instructions stored in the computer storage medium 1203.
- The processor 1201, or CPU (Central Processing Unit), is the computing core and control core of the terminal. It is suitable for implementing one or more instructions, and specifically for loading and executing one or more instructions to implement the corresponding method flow or corresponding function. In one embodiment, the processor 1201 described in the embodiments of this application may be used to perform the image acquisition method described in FIGS. 4 and 6a-6c.
- the embodiment of the present application also provides a computer storage medium (Memory).
- The computer storage medium is a memory device in a terminal for storing programs and data. It is understandable that the computer storage medium here may include a built-in storage medium of the terminal, and may also include an extended storage medium supported by the terminal.
- the computer storage medium provides storage space, and the storage space stores the operating system of the terminal.
- one or more instructions suitable for being loaded and executed by the processor 1201 are stored in the storage space, and these instructions may be one or more computer programs (including program codes).
- The computer storage medium here may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory; optionally, it may also be at least one computer storage medium located far away from the aforementioned processor.
- The processor 1201 can load and execute one or more instructions stored in the computer storage medium to implement the corresponding steps of the method in the above-mentioned image acquisition method embodiments. In a specific implementation, one or more instructions in the computer storage medium are loaded by the processor 1201 to perform the image acquisition methods described in FIGS. 4 and 6a-6c.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- User Interface Of Digital Computer (AREA)
- Image Generation (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Digital Computer Display Output (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
Claims (19)
- An image acquisition method, performed by a server, the method comprising: obtaining, from a set of application processes, a target application process corresponding to a user identifier; calling, when it is detected that an image rendering function in the target application process is called, a data interception module to intercept a plurality of currently generated pieces of window image data; performing image synthesis processing on the plurality of pieces of window image data to obtain a user interface image to be displayed; and sending a notification message carrying the user interface image to a user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on a user interface.
- The method according to claim 1, further comprising: obtaining a picture display parameter set for the target application process; and determining, according to the picture display parameter, a window layer identifier of each piece of window image data; wherein the performing image synthesis processing on the plurality of pieces of window image data to obtain a user interface image to be displayed comprises: performing image synthesis processing on the plurality of pieces of window image data according to the window layer identifiers.
- The method according to claim 2, further comprising: generating an image binary tree according to the window layer identifier of each piece of window image data; wherein the performing image synthesis processing on the plurality of pieces of window image data according to the window layer identifiers comprises: traversing the image binary tree in a preset first traversal order to obtain a synthesis order of the pieces of window image data; and performing image synthesis processing on the plurality of pieces of window image data in the synthesis order.
- The method according to claim 3, wherein the generating an image binary tree according to the window layer identifier of each piece of window image data comprises: constructing an image multi-way tree according to the window layer identifier of each piece of window image data; and converting the image multi-way tree into the image binary tree.
- The method according to claim 2, further comprising: generating an image binary tree according to the window layer identifier of each piece of window image data; wherein the calling, when it is detected that the image rendering function in the target application process is called, a data interception module to intercept a plurality of currently generated pieces of window image data comprises: traversing the image binary tree in a preset second traversal order to obtain an image capture order of the pieces of window image data; and calling the data interception module in the image capture order to intercept the plurality of pieces of window image data.
- The method according to claim 1, wherein the calling, when it is detected that the image rendering function in the target application process is called, a data interception module to intercept a plurality of currently generated pieces of window image data comprises: determining a rendering mode included in the image rendering function; determining, according to the rendering mode, a data interception submodule in the data interception module corresponding to the rendering mode; and calling the data interception submodule to intercept the plurality of pieces of window image data obtained when the image rendering function is executed.
- The method according to claim 3, further comprising: generating a message response binary tree according to the window layer identifier of each piece of window image data; wherein the performing image synthesis processing on the plurality of pieces of window image data in the synthesis order comprises: determining display position information of each piece of window image data based on the message response binary tree; and performing synthesis processing on the plurality of pieces of window image data according to the display position information and the synthesis order.
- The method according to claim 7, wherein the message response binary tree includes a plurality of message nodes, and the method further comprises: receiving a user operation message sent by the user terminal, and obtaining operation position information from the user operation message; traversing the message response binary tree in a preset third traversal order, and determining a message node for which the operation position information satisfies a preset position condition as a target message node; and using the window image data corresponding to the target message node as target window image data, and processing, in response to the user operation message, the window corresponding to the target window image data.
- The method according to claim 8, wherein each message node records window position information and window size information corresponding to the respective window image data; and the operation position information satisfying the preset position condition means that the operation position information falls within the range defined by the window position information and the window size information recorded in the target message node.
- An image acquisition apparatus, configured in a server, the apparatus comprising: an obtaining unit, configured to obtain, from a set of application processes, a target application process corresponding to a user identifier; an interception unit, configured to call a data interception module to intercept a plurality of currently generated pieces of window image data when it is detected that an image rendering function in the target application process obtained by the obtaining unit is called; a first processing unit, configured to perform image synthesis processing on the plurality of pieces of window image data intercepted by the interception unit to obtain a user interface image to be displayed; and a sending unit, configured to send a notification message carrying the user interface image obtained by the first processing unit to a user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on a user interface.
- The apparatus according to claim 10, wherein the obtaining unit is further configured to obtain a picture display parameter set for the target application process; the apparatus further comprises: a determining unit, configured to determine a window layer identifier of each piece of window image data according to the picture display parameter obtained by the obtaining unit; and the first processing unit is further configured to perform image synthesis processing on the plurality of pieces of window image data according to the window layer identifiers determined by the determining unit.
- The apparatus according to claim 11, further comprising: a first generating unit, configured to generate an image binary tree according to the window layer identifier of each piece of window image data determined by the determining unit; wherein the first processing unit is further configured to traverse, in a preset first traversal order, the image binary tree generated by the first generating unit to obtain a synthesis order of the pieces of window image data, and to perform image synthesis processing on the plurality of pieces of window image data in the synthesis order.
- The apparatus according to claim 12, wherein the first generating unit is configured to construct an image multi-way tree according to the window layer identifier of each piece of window image data determined by the determining unit, and to convert the image multi-way tree into the image binary tree.
- The apparatus according to claim 11, further comprising: a first generating unit, configured to generate an image binary tree according to the window layer identifier of each piece of window image data determined by the determining unit; wherein the interception unit is configured to traverse, in a preset second traversal order, the image binary tree generated by the first generating unit to obtain an image capture order of the pieces of window image data, and to call the data interception module in the image capture order to intercept the plurality of pieces of window image data.
- The apparatus according to claim 10, wherein the interception unit is configured to determine a rendering mode included in the image rendering function; determine, according to the rendering mode, a data interception submodule in the data interception module corresponding to the rendering mode; and call the data interception submodule to intercept the plurality of pieces of window image data obtained when the image rendering function is executed.
- The apparatus according to claim 12, further comprising: a second generating unit, configured to generate a message response binary tree according to the window layer identifier of each piece of window image data determined by the determining unit; wherein the first processing unit is configured to determine display position information of each piece of window image data based on the message response binary tree generated by the second generating unit, and to perform synthesis processing on the plurality of pieces of window image data according to the display position information and the synthesis order.
- The apparatus according to claim 16, wherein the message response binary tree includes a plurality of message nodes; the obtaining unit is further configured to receive a user operation message sent by the user terminal and obtain operation position information from the user operation message; and the apparatus further comprises: a second processing unit, configured to traverse, in a preset third traversal order, the message response binary tree generated by the second generating unit, determine a message node for which the operation position information obtained by the obtaining unit satisfies a preset position condition as a target message node, use the window image data corresponding to the target message node as target window image data, and process, in response to the user operation message, the window corresponding to the target window image data.
- A server, comprising: a processor, adapted to implement one or more instructions; and a computer storage medium storing one or more instructions, the one or more instructions being adapted to be loaded by the processor to perform the image acquisition method according to any one of claims 1 to 9.
- A computer storage medium storing computer program instructions, the computer program instructions, when executed by a processor, being used to perform the image acquisition method according to any one of claims 1 to 9.
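Claims 3, 4, 8 and 9 together describe building an image binary tree from the window hierarchy, deriving a synthesis order by traversing it, and hit-testing a user operation position against the window position/size recorded in each node. The sketch below illustrates one way these operations could fit together; all identifiers (`Window`, `BinNode`, `to_binary`, `synthesis_order`, `hit_test`) and the choice of pre-order as the "preset first traversal order" are illustrative assumptions, not details taken from the patent.

```python
# Illustrative sketch of the tree operations in claims 3, 4, 8 and 9.
# Names and the traversal choice are assumptions, not patent identifiers.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Window:
    """One window's image data: layer identifier, position and size."""
    layer_id: int
    pos: Tuple[int, int]    # top-left corner (x, y), as in claim 9
    size: Tuple[int, int]   # (width, height), as in claim 9
    children: List["Window"] = field(default_factory=list)

@dataclass
class BinNode:
    """Left-child / right-sibling binary node (claim 4 conversion)."""
    win: Window
    left: Optional["BinNode"] = None    # first child
    right: Optional["BinNode"] = None   # next sibling

def to_binary(win: Window) -> BinNode:
    """Convert the window multi-way tree into a binary tree (claim 4)."""
    node = BinNode(win)
    prev = None
    for child in win.children:
        bchild = to_binary(child)
        if prev is None:
            node.left = bchild      # first child becomes the left child
        else:
            prev.right = bchild     # later children chain as right siblings
        prev = bchild
    return node

def synthesis_order(node: Optional[BinNode], out: List[Window]) -> List[Window]:
    """Pre-order traversal as one possible 'preset first traversal order'
    (claim 3): a parent is composited before its children, so children
    end up drawn on top of it."""
    if node is not None:
        out.append(node.win)
        synthesis_order(node.left, out)
        synthesis_order(node.right, out)
    return out

def hit_test(node: Optional[BinNode], x: int, y: int) -> Optional[Window]:
    """Return the topmost window whose recorded position/size range
    contains the operation position (claims 8 and 9)."""
    if node is None:
        return None
    wx, wy = node.win.pos
    w, h = node.win.size
    own = node.win if wx <= x < wx + w and wy <= y < wy + h else None
    # Children and later siblings are composited later, i.e. drawn on
    # top, so their hits take precedence over this node's own hit.
    return hit_test(node.right, x, y) or hit_test(node.left, x, y) or own

root = Window(0, (0, 0), (800, 600), children=[
    Window(1, (10, 10), (200, 150)),
    Window(2, (300, 200), (100, 100)),
])
tree = to_binary(root)
order = [w.layer_id for w in synthesis_order(tree, [])]  # [0, 1, 2]
target = hit_test(tree, 320, 220)                        # window 2
```

In this sketch the pre-order synthesis list puts the root first, and the hit test walks the same tree to pick the topmost window containing the click, matching the patent's idea of reusing one window hierarchy for both compositing and message dispatch.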
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11202105855QA SG11202105855QA (en) | 2019-05-29 | 2020-05-25 | Image acquisition method, device, server and storage medium |
KR1020217023112A KR102425168B1 (ko) | 2019-05-29 | 2020-05-25 | 이미지 획득 방법, 기기, 서버 및 저장 매체 |
EP20813666.3A EP3979589A4 (en) | 2019-05-29 | 2020-05-25 | IMAGE CAPTURE METHOD, DEVICE, SERVER AND STORAGE MEDIUM |
JP2021548678A JP7297080B2 (ja) | 2019-05-29 | 2020-05-25 | 画像取得方法、画像取得装置、サーバ、及びコンピュータプログラム |
US17/443,481 US11606436B2 (en) | 2019-05-29 | 2021-07-27 | Image obtaining method and apparatus, server, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910468135.3 | 2019-05-29 | ||
CN201910468135.3A CN110213265B (zh) | 2019-05-29 | 2019-05-29 | 图像获取方法、装置、服务器及存储介质 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/443,481 Continuation US11606436B2 (en) | 2019-05-29 | 2021-07-27 | Image obtaining method and apparatus, server, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020238846A1 true WO2020238846A1 (zh) | 2020-12-03 |
Family
ID=67789895
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/092076 WO2020238846A1 (zh) | 2019-05-29 | 2020-05-25 | 图像获取方法、装置、服务器及存储介质 |
Country Status (7)
Country | Link |
---|---|
US (1) | US11606436B2 (zh) |
EP (1) | EP3979589A4 (zh) |
JP (1) | JP7297080B2 (zh) |
KR (1) | KR102425168B1 (zh) |
CN (1) | CN110213265B (zh) |
SG (1) | SG11202105855QA (zh) |
WO (1) | WO2020238846A1 (zh) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110213265B (zh) | 2019-05-29 | 2021-05-28 | 腾讯科技(深圳)有限公司 | 图像获取方法、装置、服务器及存储介质 |
CN112099884A (zh) * | 2020-08-11 | 2020-12-18 | 西安万像电子科技有限公司 | 一种图像渲染方法及装置 |
CN112650899B (zh) * | 2020-12-30 | 2023-10-03 | 中国平安人寿保险股份有限公司 | 数据可视化渲染方法、装置、计算机设备及存储介质 |
CN114820882A (zh) * | 2022-04-19 | 2022-07-29 | 北京百度网讯科技有限公司 | 一种图像获取方法、装置、设备及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160379400A1 (en) * | 2015-06-23 | 2016-12-29 | Intel Corporation | Three-Dimensional Renderer |
CN106843897A (zh) * | 2017-02-09 | 2017-06-13 | 腾讯科技(深圳)有限公司 | 一种截取游戏画面的方法和装置 |
CN107071331A (zh) * | 2017-03-08 | 2017-08-18 | 苏睿 | 图像显示方法、装置和系统、存储介质及处理器 |
CN108939556A (zh) * | 2018-07-27 | 2018-12-07 | 珠海金山网络游戏科技有限公司 | 一种基于游戏平台的截图方法及装置 |
CN110213265A (zh) * | 2019-05-29 | 2019-09-06 | 腾讯科技(深圳)有限公司 | 图像获取方法、装置、服务器及存储介质 |
CN109582425B (zh) * | 2018-12-04 | 2020-04-14 | 中山大学 | 一种基于云端与终端gpu融合的gpu服务重定向系统及方法 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6687335B1 (en) * | 1997-03-31 | 2004-02-03 | Southwestern Bell Telephone Company | User interface and system to facilitate telephone circuit maintenance and testing |
JP2003190640A (ja) * | 2001-12-27 | 2003-07-08 | Koei:Kk | 描画コマンド制御方法、記録媒体、描画コマンド制御装置及びプログラム |
US7486294B2 (en) * | 2003-03-27 | 2009-02-03 | Microsoft Corporation | Vector graphics element-based model, application programming interface, and markup language |
US7260784B2 (en) * | 2003-05-07 | 2007-08-21 | International Business Machines Corporation | Display data mapping method, system, and program product |
US8079037B2 (en) * | 2005-10-11 | 2011-12-13 | Knoa Software, Inc. | Generic, multi-instance method and GUI detection system for tracking and monitoring computer applications |
US20070300179A1 (en) * | 2006-06-27 | 2007-12-27 | Observe It Ltd. | User-application interaction recording |
US20110289117A1 (en) * | 2010-05-19 | 2011-11-24 | International Business Machines Corporation | Systems and methods for user controllable, automated recording and searching of computer activity |
US9361464B2 (en) * | 2012-04-24 | 2016-06-07 | Jianqing Wu | Versatile log system |
CN103455234A (zh) | 2012-06-01 | 2013-12-18 | 腾讯科技(深圳)有限公司 | 显示应用程序界面的方法及装置 |
US9069608B2 (en) * | 2013-03-06 | 2015-06-30 | Vmware, Inc. | Method and system for providing a roaming remote desktop |
US10387546B1 (en) * | 2013-06-07 | 2019-08-20 | United Services Automobile Association | Web browsing |
EP3092622A4 (en) * | 2014-01-09 | 2017-08-30 | Square Enix Holdings Co., Ltd. | Methods and systems for efficient rendering of game screens for multi-player video game |
KR102455232B1 (ko) * | 2015-06-23 | 2022-10-17 | 삼성전자 주식회사 | 콘텍스트 기반 탭 관리를 위한 방법 및 전자 장치 |
US10179290B2 (en) * | 2016-07-21 | 2019-01-15 | Sony Interactive Entertainment America Llc | Method and system for accessing previously stored game play via video recording as executed on a game cloud system |
CN106371824A (zh) * | 2016-08-23 | 2017-02-01 | 广州优视网络科技有限公司 | 便携式设备及应用程序弹出消息显示控制方法和装置 |
US10180815B1 (en) * | 2017-09-06 | 2019-01-15 | Xmpie (Israel) Ltd. | Systems and methods for variable data printing |
-
2019
- 2019-05-29 CN CN201910468135.3A patent/CN110213265B/zh active Active
-
2020
- 2020-05-25 JP JP2021548678A patent/JP7297080B2/ja active Active
- 2020-05-25 KR KR1020217023112A patent/KR102425168B1/ko active IP Right Grant
- 2020-05-25 SG SG11202105855QA patent/SG11202105855QA/en unknown
- 2020-05-25 EP EP20813666.3A patent/EP3979589A4/en active Pending
- 2020-05-25 WO PCT/CN2020/092076 patent/WO2020238846A1/zh unknown
-
2021
- 2021-07-27 US US17/443,481 patent/US11606436B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160379400A1 (en) * | 2015-06-23 | 2016-12-29 | Intel Corporation | Three-Dimensional Renderer |
CN106843897A (zh) * | 2017-02-09 | 2017-06-13 | 腾讯科技(深圳)有限公司 | 一种截取游戏画面的方法和装置 |
CN107071331A (zh) * | 2017-03-08 | 2017-08-18 | 苏睿 | 图像显示方法、装置和系统、存储介质及处理器 |
CN108939556A (zh) * | 2018-07-27 | 2018-12-07 | 珠海金山网络游戏科技有限公司 | 一种基于游戏平台的截图方法及装置 |
CN109582425B (zh) * | 2018-12-04 | 2020-04-14 | 中山大学 | 一种基于云端与终端gpu融合的gpu服务重定向系统及方法 |
CN110213265A (zh) * | 2019-05-29 | 2019-09-06 | 腾讯科技(深圳)有限公司 | 图像获取方法、装置、服务器及存储介质 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3979589A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN110213265B (zh) | 2021-05-28 |
KR20210104143A (ko) | 2021-08-24 |
US11606436B2 (en) | 2023-03-14 |
CN110213265A (zh) | 2019-09-06 |
JP7297080B2 (ja) | 2023-06-23 |
US20210360086A1 (en) | 2021-11-18 |
EP3979589A1 (en) | 2022-04-06 |
SG11202105855QA (en) | 2021-07-29 |
JP2022534346A (ja) | 2022-07-29 |
EP3979589A4 (en) | 2022-08-10 |
KR102425168B1 (ko) | 2022-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020238846A1 (zh) | 图像获取方法、装置、服务器及存储介质 | |
US9549007B2 (en) | User interface widget unit sharing for application user interface distribution | |
WO2019169913A1 (zh) | 一种数据处理的方法、装置、服务器和系统 | |
US8924985B2 (en) | Network based real-time virtual reality input/output system and method for heterogeneous environment | |
CN112791399B (zh) | 云游戏画面的显示方法及装置、系统、介质、电子设备 | |
EP3311565B1 (en) | Low latency application streaming using temporal frame transformation | |
US8819139B2 (en) | Virtual desktop infrastructure (VDI) login acceleration | |
WO2024066828A1 (zh) | 一种数据处理方法、装置、设备、计算机可读存储介质及计算机程序产品 | |
CN108073350A (zh) | 一种用于云渲染的对象存储系统和方法 | |
CN108074210A (zh) | 一种用于云渲染的对象获取系统和方法 | |
CN114237840A (zh) | 资源交互方法、装置、终端及存储介质 | |
CN115794139A (zh) | 镜像数据处理方法、装置、设备以及介质 | |
CN114222003A (zh) | 服务调用方法、系统、装置、设备及存储介质 | |
CN112328356B (zh) | Android与Windows的互通方法、装置、存储介质及计算机设备 | |
CN104038511B (zh) | 一种资源管理方法及装置 | |
US11425219B1 (en) | Smart stream capture | |
CN111111175A (zh) | 一种游戏画面生成方法、装置和移动终端 | |
WO2023035619A1 (zh) | 一种场景渲染方法、装置、设备及系统 | |
CN116310232A (zh) | 数字藏品的数据处理方法、设备、存储介质及程序产品 | |
US10715846B1 (en) | State-based image data stream provisioning | |
CN114785848A (zh) | 电子设备之间的协同交互和协同方法、装置和系统 | |
EP4038545A1 (en) | Freeview video coding | |
KR20170055881A (ko) | 온라인 게임 제공 시스템 및 방법 | |
US20240013461A1 (en) | Interactive Animation Generation | |
KR102503119B1 (ko) | 트리-기반 포인트 클라우드 압축 미디어 스트림을 위한 방법 및 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20813666 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20217023112 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2021548678 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020813666 Country of ref document: EP Effective date: 20220103 |