WO2020238846A1 - Image acquisition method, apparatus, server and storage medium

Image acquisition method, apparatus, server and storage medium

Info

Publication number
WO2020238846A1
Authority
WO
WIPO (PCT)
Prior art keywords
window
image
image data
user
binary tree
Prior art date
Application number
PCT/CN2020/092076
Other languages
English (en)
French (fr)
Inventor
黄耿星
杨卫
魏雪
王赐烺
于搏睿
刘志伟
杨广东
曾铖
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to SG11202105855QA
Priority to KR1020217023112A (KR102425168B1)
Priority to EP20813666.3A (EP3979589A4)
Priority to JP2021548678A (JP7297080B2)
Publication of WO2020238846A1
Priority to US17/443,481 (US11606436B2)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/131 Protocols for games, networked simulations or virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781 Games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00 Arrangements for software engineering
    • G06F8/30 Creation or generation of source code
    • G06F8/38 Creation or generation of source code for implementing user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14 Display of multiple viewports
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/363 Graphics controllers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/10 Protocols in which an application is distributed across nodes in the network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/34 Network arrangements or protocols for supporting network services or applications involving the movement of software or configuration parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/488 Data services, e.g. news ticker
    • H04N21/4882 Data services, e.g. news ticker for displaying messages, e.g. warnings, reminders
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A63F13/355 Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/209 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform characterized by low level software layer, relating to hardware management, e.g. Operating System, Application Programming Interface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G06F9/452 Remote windowing, e.g. X-Window System, desktop virtualisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/455 Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/06 Use of more than one graphics processor to process data before displaying to one or more screens
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00 Aspects of data communication
    • G09G2370/02 Networking aspects
    • G09G2370/022 Centralised management of display operation, e.g. in a server instead of locally
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00 Aspects of data communication
    • G09G2370/02 Networking aspects
    • G09G2370/027 Arrangements and methods specific for the display of internet documents

Definitions

  • This application relates to the field of Internet technology, and in particular to an image acquisition method, apparatus, server, and storage medium.
  • Cloud gaming is an online gaming technology in which the game runs on a cloud server.
  • The cloud server renders the game scene into a game screen and transmits it to the user terminal in real time, so that the user of the user terminal can perform corresponding operations on the game screen. Cloud gaming thus allows high-quality games to run on user terminals whose graphics processing and data computing capabilities are relatively limited.
  • For this reason, cloud gaming has become a research hotspot in the gaming field in recent years.
  • One of the key issues in cloud gaming research is how the cloud server obtains the game screen of a running game.
  • The current common way to solve this problem is virtual-machine-based screen capture. Specifically, one virtual machine is deployed for each user terminal; the virtual machine renders the game scene to obtain the game screen, and the cloud server directly captures the desktop of the virtual machine to obtain the game screen.
  • However, this method requires a large number of virtual machines to be deployed, which occupies a large amount of server resources and results in a waste of server resources.
  • The embodiments of this application provide an image acquisition method, apparatus, server, and storage medium.
  • The server intercepts the window image data generated while a target application program is running and performs image synthesis on that window image data to obtain the user interface image to be displayed for the target application program. Compared with the prior-art manner of obtaining the user interface image through a virtual machine, this saves the resource overhead of deploying virtual machines.
  • An embodiment of the present application provides an image acquisition method, which is applied to a server and includes:
  • sending a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on the user interface.
  • An embodiment of the present application further provides an image acquisition apparatus, which is configured in a server and includes:
  • an acquiring unit, configured to acquire the target application process corresponding to the user identifier from the application process set;
  • an interception unit, configured to call the data interception module to intercept the multiple pieces of currently generated window image data when it is detected that the image rendering function in the target application process is called;
  • a first processing unit, configured to perform image synthesis processing on the multiple pieces of window image data to obtain a user interface image to be displayed;
  • a sending unit, configured to send a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on the user interface.
  • An embodiment of the present application further provides a server, the server including:
  • a computer storage medium storing one or more instructions, the one or more instructions being suitable for being loaded by the processor to execute the image acquisition method described above.
  • An embodiment of the present application further provides a computer storage medium storing one or more instructions, the one or more instructions being suitable for being loaded by a processor to execute the image acquisition method described above.
  • FIG. 1 is an example architecture diagram of image acquisition;
  • FIG. 2 is an architecture diagram of image acquisition provided by an embodiment of the present application;
  • FIG. 3a is a desktop view of a cloud server provided by an embodiment of the present application;
  • FIG. 3b is a schematic diagram of a user interface image provided by an embodiment of the present application;
  • FIG. 4 is a schematic flowchart of an image acquisition method provided by an embodiment of the present application;
  • FIG. 5a is a schematic diagram of obtaining window image data according to an embodiment of the present application;
  • FIG. 5b is a schematic diagram of window image data provided by an embodiment of the present application;
  • FIG. 5c is a schematic diagram of another piece of window image data provided by an embodiment of the present application;
  • FIG. 5d is a schematic diagram of another piece of window image data provided by an embodiment of the present application;
  • FIG. 6a is a schematic flowchart of another image acquisition method provided by an embodiment of the present application;
  • FIG. 6b is a schematic flowchart of another image acquisition method provided by an embodiment of the present application;
  • FIG. 6c is a schematic flowchart of another image acquisition method provided by an embodiment of the present application;
  • FIG. 7a is a schematic diagram of an image polytree provided by an embodiment of the present application;
  • FIG. 7b is a schematic diagram of an image binary tree provided by an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a data interception module provided by an embodiment of the present application;
  • FIG. 9 is a schematic diagram of image synthesis processing provided by an embodiment of the present application;
  • FIG. 10a is a desktop view of an application server provided by an embodiment of the present application;
  • FIG. 10b is a schematic diagram of a user interface image provided by an embodiment of the present application;
  • FIG. 11 is a schematic structural diagram of an image acquisition apparatus provided by an embodiment of the present application;
  • FIG. 12 is a schematic structural diagram of a server provided by an embodiment of the present application.
  • An example architecture of image acquisition may be as shown in FIG. 1, in which:
  • 101 denotes an application server, on which multiple application processes can run;
  • 102 denotes the target application process corresponding to a user identifier among the multiple application processes;
  • 103 denotes a user terminal that has logged in to the target application with the user identifier;
  • 104 denotes a virtual machine corresponding to the user terminal 103.
  • The application server provides service support for the operation of the target application; that is, the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application. The target application may be any application, such as a game application, an instant chat application, or a shopping application.
  • The application server provides service support for the target application based on the service program related to the target application. Specifically, the application server can support the target application in realizing its application functions by executing the service program; in the computer field, the activity of the application server running the service program is called an application process. If multiple user terminals run the target application at the same time, then in order to ensure that the target applications in the individual user terminals do not interfere with each other, the application server creates an application process for each user to support the operation of that user's target application. That is, the application server executes a set of application processes that provide service support for realizing the application functions of the target application; the set includes multiple application processes, and each application process corresponds to a user identifier.
  • The target application process described in the embodiments of the present application may refer to the application process corresponding to any user identifier.
  • For example, after the application server detects operation instructions from user A and user B, it creates an application process A and an application process B for user A and user B respectively, so that the players on the two user terminals can experience the game application independently.
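The one-process-per-user bookkeeping described above can be sketched in a few lines. This is only an illustration under stated assumptions, not the patent's implementation; the class and method names are invented for the example.

```python
# Minimal sketch of the "application process set": one application
# process per user identifier, created on the user's first operation
# instruction and reused afterwards, so users do not interfere.
# All names here are illustrative, not taken from the patent.

class ApplicationProcess:
    def __init__(self, user_id):
        self.user_id = user_id
        self.window_image_data = []   # filled while the target application runs

class ProcessManager:
    def __init__(self):
        self._processes = {}          # user identifier -> application process

    def on_operation_instruction(self, user_id):
        # Create a dedicated application process for a new user,
        # or return the existing one for a known user.
        if user_id not in self._processes:
            self._processes[user_id] = ApplicationProcess(user_id)
        return self._processes[user_id]

manager = ProcessManager()
proc_a = manager.on_operation_instruction("user_A")
proc_b = manager.on_operation_instruction("user_B")
assert proc_a is not proc_b                                   # isolated per user
assert proc_a is manager.on_operation_instruction("user_A")   # reused, not recreated
```

The point of the dictionary keyed by user identifier is exactly the isolation property the paragraph above describes: each user's target application is backed by its own process object.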
  • When the application server 101 receives an operation instruction for the target application sent by the user terminal 103, the application server 101 notifies the virtual machine 104 corresponding to the user identifier of the user terminal 103 to run the target application process 102; other virtual machines run application process 1 and application process 2 respectively.
  • The application server 101 obtains the user interface image corresponding to the user identifier by running the target application process on the virtual machine 104.
  • The user interface image corresponding to the user identifier may be obtained through the virtual machine as follows: the virtual machine 104 calls the image rendering function of the target application process 102 to perform rendering and obtain the rendered window image data; the virtual machine 104 then synthesizes the window image data and other parameters to obtain the image to be displayed, and displays that image in the user interface of the virtual machine 104; the application server 101 captures the current user interface of the virtual machine 104, uses the captured image as the user interface image corresponding to the user identifier, and sends the user interface image to the user terminal 103. In this way, the user identified by the user identifier can view the user interface image corresponding to the target application process 102 through the user terminal 103.
  • the image acquisition architecture diagram may also be as shown in FIG. 2.
  • 201 denotes an application server; the application server 201 executes the set of application processes that provide service support for realizing the application functions of the target application.
  • 202 denotes the target application process corresponding to the user identifier in the application process set; 203 denotes the user terminal that logs in to the target application with the user identifier, that is, the user terminal corresponding to the user identifier.
  • Each application process in the application process set is also configured with a data interception module; for example, a data interception module 2021 is configured in the target application process 202, and data interception modules are likewise configured in application process 1 and application process 2.
  • The application server 201 may call the data interception module 2021 in the target application process 202 to capture window image data. The window image data is obtained when the image rendering function is called for rendering during the execution of the target application process 202; image synthesis is then performed on the window image data to obtain the user interface image to be displayed.
  • A picture synthesizer 204 may be configured in the application server 201. The application server 201 may call the data interception module to intercept the window image data and send it to the picture synthesizer 204, which is called to perform image synthesis processing on the window image data and obtain the user interface image to be displayed. After obtaining the user interface image, the application server 201 sends a notification message carrying the user interface image to the user terminal 203, so that the user terminal 203 can display the user interface image on the user interface.
  • In the architecture of FIG. 1, by contrast, each application process runs on a corresponding virtual machine instead of running directly on the application server, which results in low operating efficiency of the application processes.
  • In view of this, an embodiment of the present application provides an image acquisition method that can be applied to an application server, where the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application, such as the application server 101 in FIG. 1 or the application server 201 in FIG. 2.
  • Through the image acquisition method described in the embodiments of this application, the user interface image corresponding to a certain user identifier can be obtained, and the obtained user interface image can be carried in a notification message and sent to the user terminal corresponding to the user identifier to instruct the user terminal to display the user interface image on the user interface. In this way, the user identified by the user identifier can operate the target application through the user interface image.
  • Compared with the virtual-machine-based approach, the embodiments of the present application save the resource overhead of deploying virtual machines in the application server.
  • The image acquisition method described in the embodiments of this application can be applied to the application scenario of cloud games.
  • A so-called cloud game means that the games corresponding to multiple user identifiers run on a cloud server; after rendering the game scene in each game into a game animation, the cloud server transmits it in real time to the user terminal corresponding to each user identifier, so that the user identified by each user identifier can input corresponding operations on the game screen.
  • FIG. 3a is a desktop view of a cloud server provided by an embodiment of this application.
  • The cloud server may be the application server 101 in FIG. 1 or the application server 201 in FIG. 2, and 300 in FIG. 3a denotes the desktop of the cloud server.
  • multiple application processes supporting game running are running on the cloud server at the same time.
  • the cloud server can acquire game screens corresponding to multiple application processes.
  • the desktop view can display game screens such as 301, 302, and 303 corresponding to each application process at the same time.
  • The game screen corresponding to each application process is the game screen corresponding to the respective user identifier.
  • 301 indicates a game screen corresponding to user identification A
  • 302 indicates a game screen corresponding to user identification B
  • 303 indicates a game screen corresponding to user identification C.
  • the game screen corresponding to each user identification can be transmitted to the corresponding user terminal for display.
  • The game screen seen in the user interface of a user terminal is the game screen corresponding to that terminal's user identifier in the cloud server; game screens corresponding to other user identifiers are not displayed. That is, the games corresponding to the individual users do not affect one another.
  • Take as an example the user interface image of the user terminal corresponding to user identifier A among the multiple user identifiers, where the user terminal is the user terminal 103 in FIG. 1 or the user terminal 203 in FIG. 2. It can be seen that only the game screen 301 corresponding to user identifier A is displayed in the user interface of the user terminal corresponding to user identifier A.
  • an embodiment of the present application provides a schematic flowchart of an image acquisition method, as shown in FIG. 4.
  • the image acquisition method described in FIG. 4 may be executed by a server, and specifically may be executed by a processor of the server.
  • For example, it can be applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2, where the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application.
  • the image acquisition method described in FIG. 4 may include the following steps:
  • The user identifier is used to indicate the identity of the user, and may include any one or more of the user's ID card number, the user's mobile phone number, and the user's login account.
  • the acquisition of the target application process corresponding to the user identifier may be executed by the application server after receiving the user operation request sent by the user terminal.
  • the user terminal refers to a terminal device that has logged in based on a user ID.
  • The terminal device may be any device capable of video decompression, such as a mobile phone, a tablet, or a desktop computer; the user operation request may include any one or more of a login operation, a window switching operation, and a logout operation on the target application.
  • If the user terminal has not yet logged in to the target application based on the user identifier, the user operation request received at this time may refer to the login operation; if the user terminal has already logged in to the target application based on the user identifier, the user operation request at this time may refer to a logout operation, a window switching operation on any of the windows included in the target application, a request to run a certain application function of the target application, and so on.
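A toy dispatcher for the three request types named above (login, window switching, logout) might look as follows. The session dictionary, return strings, and function name are assumptions made for the example; the patent does not specify a request format.

```python
# Illustrative dispatch of user operation requests. Before login, only
# a login operation is meaningful; afterwards, window switching and
# logout are accepted. All names and return values are invented.

def handle_request(sessions, user_id, op, window=None):
    logged_in = user_id in sessions
    if op == "login" and not logged_in:
        sessions[user_id] = {"window": "Home"}   # session starts on a default window
        return "logged_in"
    if not logged_in:
        return "rejected"                        # nothing else is valid before login
    if op == "switch_window":
        sessions[user_id]["window"] = window
        return "switched_to_" + window
    if op == "logout":
        del sessions[user_id]
        return "logged_out"
    return "unknown"

sessions = {}
assert handle_request(sessions, "u1", "switch_window", "Store") == "rejected"
assert handle_request(sessions, "u1", "login") == "logged_in"
assert handle_request(sessions, "u1", "switch_window", "Store") == "switched_to_Store"
assert handle_request(sessions, "u1", "logout") == "logged_out"
```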
  • FIG. 5a is a schematic diagram of obtaining window image data according to an embodiment of the present application.
  • The obtaining method may be executed by a server, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2, and includes the following steps:
  • S4023 Call the data interception module to obtain window image data generated by the image rendering function for each window.
  • Image rendering refers to the process of converting a three-dimensional light-energy transfer process into a two-dimensional image.
  • An image rendering function refers to a function that can implement image rendering, for example:
  • OpenGL (Open Graphics Library);
  • GDI (Graphics Device Interface);
  • the DirectX programming interface.
  • OpenGL is a low-level 3D graphics library that provides only rendering functionality and can be ported across different platforms.
  • DirectX is designed mainly around GPU hardware support and is more suitable for the image rendering of game applications.
  • GDI is designed for broad-spectrum use; it can also be used for the image rendering of game applications, but the effect is not as good as that of DirectX.
  • The rendering mode used for each window in the image rendering function of the target application process may differ.
  • For example, when rendering the first window of the target application, the rendering mode included in the image rendering function may be GDI; when rendering the second window of the target application, the rendering mode included in the image rendering function may be OpenGL.
  • For example, the game application may include three windows: a "Home" window, a "Store" window, and a "Video" window. Each window corresponds to one piece of window image data. Specifically, the "Home" window image data is shown in FIG. 5b, the "Store" window image data is shown in FIG. 5c, and the "Video" window image data is shown in FIG. 5d.
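Conceptually, the data interception module of step S4023 wraps the process's rendering entry point so that a copy of each window's image data is captured at the moment the image rendering function is called, without disturbing the application. The sketch below models this in Python with a plain function wrapper; a real module would hook native OpenGL/GDI/DirectX entry points, and all names here are invented for illustration.

```python
# Conceptual sketch of a data interception module: the original
# per-window render function is wrapped so that every call also hands
# a copy of the produced window image data to the interceptor, while
# the application itself still receives the unmodified result.

def render_window(window_name):
    # Stand-in for the image rendering function: returns the
    # "window image data" (here, just a labeled placeholder string).
    return "pixels<" + window_name + ">"

class DataInterceptionModule:
    def __init__(self):
        self.captured = []

    def hook(self, render_fn):
        def wrapped(window_name):
            data = render_fn(window_name)              # original rendering runs first
            self.captured.append((window_name, data))  # then a copy is intercepted
            return data                                # caller is unaffected
        return wrapped

interceptor = DataInterceptionModule()
render_window = interceptor.hook(render_window)

for name in ("Home", "Store", "Video"):   # the three windows in the example above
    render_window(name)

assert [w for w, _ in interceptor.captured] == ["Home", "Store", "Video"]
```

The design point is that interception is transparent: the wrapped function returns exactly what the original returned, so the target application process behaves identically whether or not the module is attached.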
  • S403 Perform image synthesis processing on multiple window image data to obtain a user interface image to be displayed.
  • the image synthesis processing may refer to sequentially superimposing each window image data according to the synthesis sequence of each window image data.
  • the synthesis order may refer to the order in which each window image data is processed in the image synthesis process, and the window image data with the earlier synthesis order is processed earlier in the image synthesis process.
  • the synthesis sequence may be determined according to the window level identifier corresponding to each window image data.
  • the window level identifier is used to indicate the level of the window. The higher the level indicated by the window level identifier, the lower the synthesis order of the window image data.
  • For example, the window level identifiers of the three window image data are, in descending order, "Video", "Store", and "Homepage". According to the above rule relating the window level identifier to the synthesis order, the synthesis order of the three window image data in Figures 5b-5d is "Homepage", "Store", "Video".
  • If the three window image data are synthesized according to this synthesis order, the obtained user interface image to be displayed may be as shown in Figure 3b.
  • the application server may call a screen synthesizer to synthesize the image data of each window.
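The superimposition described in step S403 can be sketched in a few lines of Python. This is an illustrative sketch only: the pixel layout, window names, and the `synthesize` helper are hypothetical and not from the patent; it shows how window image data applied later in the synthesis order covers earlier data where windows overlap.

```python
# Minimal sketch of image synthesis by superimposition. Each window image
# is modeled as a dict of pixel positions it covers; images are applied in
# synthesis order, so later images overwrite earlier ones where they overlap.

def synthesize(window_images):
    """window_images: list of (name, pixels) in synthesis order,
    where pixels maps (x, y) -> pixel value."""
    canvas = {}
    for name, pixels in window_images:
        for pos, value in pixels.items():
            canvas[pos] = value  # later windows cover earlier ones
    return canvas

# "Homepage" is composited first, then "Store", then "Video" (the synthesis
# order from the example above), so "Video" covers "Store" where they overlap.
frame = synthesize([
    ("homepage", {(0, 0): "H", (1, 0): "H"}),
    ("store",    {(1, 0): "S", (2, 0): "S"}),
    ("video",    {(2, 0): "V"}),
])
```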
  • S404 Send the notification message carrying the user interface image to the user terminal corresponding to the user identification, so that the user terminal displays the user interface image on the user interface.
  • the notification message is used to instruct the user terminal to display a user interface image on the user interface.
  • a schematic diagram of the user terminal displaying the user interface image on the user interface may be shown in FIG. 3b.
  • The application server in the embodiment of the present application intercepts the window image data obtained by calling the rendering function during execution of the target application, and then performs image synthesis processing on the window image data to obtain the user interface image, which can save resource overhead on the application server.
  • FIG. 6a is a schematic flowchart of another image acquisition method provided by an embodiment of the present application.
  • the image acquisition method described in FIG. 6a may be executed by a server, and specifically may be executed by a processor of the server. For example, it can be applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2, and the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application.
  • the image acquisition method described in FIG. 6a may include the following steps:
  • S601 Obtain a target application process corresponding to a user identifier from the set of application processes.
  • This step is the same as the above step 401, and will not be repeated here.
  • S602 When it is detected that the image rendering function in the target application process is called, determine the rendering mode included in the image rendering function.
  • Each window image data is obtained by executing one or more image rendering functions of the target application, and the rendering mode included in each image rendering function can be any one of OpenGL, GDI, and DirectX.
  • S603 According to the rendering mode, determine a data interception sub-module corresponding to the rendering mode in the data interception module.
  • The data interception module may include three interception submodules, which respectively correspond to the above three rendering modes: an OpenGL interception submodule, a GDI interception submodule, and a DirectX interception submodule.
  • The correspondence between each interception submodule and the rendering modes may be: the rendering mode corresponding to the OpenGL interception submodule is OpenGL, the rendering mode corresponding to the GDI interception submodule is GDI, and the rendering mode corresponding to the DirectX interception submodule is DirectX. It should be understood that calling the interception submodule corresponding to the rendering mode to obtain window image data ensures the accuracy of the intercepted window image data.
  • the schematic diagram of the data interception module provided by the embodiment of the present application may be as shown in FIG. 8.
  • the data interception module shown in FIG. 8 may include a GDI interception sub-module, a DirectX interception sub-module, and an OpenGL interception sub-module.
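The selection of an interception submodule by rendering mode (steps S603-S604) amounts to a simple dispatch. A minimal sketch, assuming hypothetical submodule functions that merely tag the captured window; the function names and data shapes are illustrative, not from the patent:

```python
# Sketch of dispatching to a data-interception submodule by rendering mode,
# mirroring the three submodules of Fig. 8.

def gdi_intercept(window):      return ("GDI", window)
def directx_intercept(window):  return ("DirectX", window)
def opengl_intercept(window):   return ("OpenGL", window)

INTERCEPT_SUBMODULES = {
    "GDI": gdi_intercept,
    "DirectX": directx_intercept,
    "OpenGL": opengl_intercept,
}

def intercept(rendering_mode, window):
    # Choose the submodule matching the window's rendering mode (S603),
    # then call it to capture the window image data (S604).
    return INTERCEPT_SUBMODULES[rendering_mode](window)
```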
  • S604 Call the data interception sub-module to intercept multiple window image data obtained when the image rendering function is executed.
  • While performing steps S602-S604, the following step S605 may also be performed:
  • S605 Obtain screen display parameters set for the target application process, and determine the window level identifier of each window image data according to the screen display parameters.
  • The screen display parameters may include any one or more of: whether the window corresponding to each window image data is minimized or maximized, and the arrangement relationship between the windows. For example, the arrangement relationship can be that window A is placed in front of window B, window C is placed behind window B, and so on.
  • the screen display parameters may be stored in a sandbox.
  • the sandbox described in this embodiment of the application is a mechanism that can isolate multiple application processes running on the application server from each other.
  • the screen display parameters set for the target application process stored in the sandbox may be parameters set by default for the target application process by the application server.
  • For example, the default parameters set for the target application process that provides service support for realizing the game functions of the game application may include: the window corresponding to the "Homepage" window image data is placed at the front, the window corresponding to the "Store" window image data is placed behind the window corresponding to the "Homepage" window image data, and the window corresponding to the "Video" window image data is placed behind the window corresponding to the "Store" window image data.
  • At this time, the screen display parameters can be: the "Homepage" window image data is displayed at the top, the "Store" window image data is placed after the "Homepage" window image data, and the "Video" window image data is placed after the "Store" window image data.
  • the screen display parameters set for the target application process stored in the sandbox may be determined according to the user operation request sent by the user terminal and the default parameters set by the application server for the target application process.
  • For example, the parameters set by default for the target application process that provides service support for realizing the game functions of the game application may include: the window corresponding to the "Homepage" window image data is displayed first, the window corresponding to the "Store" window image data is placed behind the window corresponding to the "Homepage" window image data, and the window corresponding to the "Video" window image data is placed behind the window corresponding to the "Store" window image data. Suppose it is detected that the user operation request includes switching from the window corresponding to the "Homepage" window image data to the window corresponding to the "Video" window image data.
  • Then the screen display parameters determined according to the user operation request and the parameters set by default by the application server may include: the window corresponding to the "Video" window image data is displayed at the top, the window corresponding to the "Homepage" window image data is placed behind the window corresponding to the "Video" window image data, and the window corresponding to the "Store" window image data is placed behind the window corresponding to the "Homepage" window image data.
  • Furthermore, the window level identifier of each window image data can be determined according to the screen display parameters.
  • The window level identifier is used to uniquely indicate the corresponding window image data, and the window level identifier also indicates the hierarchical relationship between window image data.
  • The window level identifier can be represented by one or more numbers, such as 1, 2, 3, or (1, 2), (2, 1), (3, 1), etc.; the window level identifier can also be represented in other forms.
  • When numbers are used to represent window level identifiers and the count of numbers differs between identifiers, the identifier represented by more numbers is the higher one; for example, the window level identifier represented by (1, 2) is higher than the window level identifier represented by 1.
  • When two window level identifiers are each represented by N numbers (N being any positive integer greater than 1), their first N-1 numbers are the same, and their Nth numbers differ, the identifier with the smaller sum of its numbers is the higher one; for example, the window level identifier represented by (1, 2) is higher than the window level identifier represented by (1, 5).
  • For another example, the window corresponding to the window level identifier (1, n) (where n is any positive integer greater than or equal to 1) is a child window of the window corresponding to the window level identifier (1); the window corresponding to the window level identifier (1, n, m) is a child window of the window corresponding to the window level identifier (1, n). Based on the foregoing description, the window level identifier corresponding to a child window is higher than the window level identifier corresponding to its parent window.
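The comparison rules above can be captured in a small sort key. This is a sketch under the assumption that window level identifiers are modeled as Python tuples of numbers; `level_sort_key` is an illustrative helper, not part of the patent:

```python
# Ordering rules for window level identifiers: a longer tuple (a deeper
# child window) ranks higher, and among tuples of the same length that
# share a prefix, the one with the smaller differing number ranks higher.

def level_sort_key(identifier):
    # Sorting ascending by this key lists identifiers from highest to lowest:
    # more numbers first, then smaller numbers first.
    return (-len(identifier), identifier)

identifiers = [(1,), (2,), (1, 5), (1, 2)]
ordered = sorted(identifiers, key=level_sort_key)
# (1, 2) ranks above (1, 5) (same prefix, smaller differing number), and
# both rank above (1,) and (2,) (child windows rank above their parents).
```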
  • S606 Perform image synthesis processing on multiple window image data according to the window level identification to obtain a user interface image to be displayed.
  • the image synthesis processing is performed on the image data of the multiple windows according to the order of the hierarchical relationship from high to low.
  • S607 Send the notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on the user interface.
  • This step is the same as the above step S404, and will not be repeated here.
  • FIG. 6b is a schematic flowchart of another image acquisition method provided by an embodiment of the present application.
  • the image acquisition method described in FIG. 6b may be executed by a server, and specifically may be executed by a processor of the server. For example, it can be applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2.
  • The application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application. As shown in FIG. 6b, the image acquisition method may include the following steps:
  • S701 Obtain a target application process corresponding to the user identifier from the application process set.
  • This step is the same as the above step 401, and will not be repeated here.
  • S702 Obtain screen display parameters set for the target application process, and determine the window level identifier of each window image data according to the screen display parameters.
  • This step is the same as the above step 605 and will not be repeated here.
  • S703 Generate an image binary tree according to the window level identifier of the image data of each window.
  • an image binary tree is generated according to the window level identifier of each window image data.
  • the image binary tree may include multiple binary tree nodes, and each binary tree node stores a window level identifier.
  • the generating an image binary tree according to the window level identifiers of the respective window image data may include:
  • S7031 Construct an image polytree according to the window level identifiers of the image data of each window.
  • The image polytree includes N levels, and each level includes M nodes, where N and M are both positive integers greater than or equal to 1. Two adjacent levels in the image polytree include parent nodes and child nodes in a parent-child relationship; nodes in the same level of the image polytree that share the same parent node are sibling nodes.
  • S7032 Convert the image polytree into an image binary tree.
  • the first node in each level of the image polytree is converted into the left node of the image binary tree, and other nodes that are siblings with the first node are converted into the right node of the image binary tree .
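The conversion in steps S7031-S7032 is the classic left-child/right-sibling encoding of a polytree. A minimal sketch (the `BinNode` class and the nested-list polytree encoding are illustrative assumptions, not from the patent):

```python
# Left-child / right-sibling conversion: the first child of each polytree
# node becomes the left node in the binary tree, and each later sibling
# becomes the right node of the sibling before it.

class BinNode:
    def __init__(self, label):
        self.label = label
        self.left = None   # first child in the polytree
        self.right = None  # next sibling in the polytree

def to_binary_tree(label, children):
    """children: list of (label, children) polytree subtrees."""
    node = BinNode(label)
    prev = None
    for child_label, grandchildren in children:
        child = to_binary_tree(child_label, grandchildren)
        if prev is None:
            node.left = child   # first node in the level -> left node
        else:
            prev.right = child  # sibling nodes -> chained right nodes
        prev = child
    return node

# The polytree of Fig. 7a: root 0 with children 1, 2, 3; node 1 has
# children (1, 1) and (1, 2); node 3 has child (3, 1).
root = to_binary_tree(0, [
    (1, [((1, 1), []), ((1, 2), [])]),
    (2, []),
    (3, [((3, 1), [])]),
])
```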
  • For example, an image polytree constructed according to the window level identifiers of the window image data is shown in Figure 7a.
  • The image polytree includes 3 levels, labeled 701-703.
  • The number in each node represents the window level identifier of the window image data corresponding to that node.
  • As can be seen from the figure, one node is included in level 701, and the node included in level 701 may be referred to as the root node.
  • the root node in this embodiment of the application is the desktop image data of the user terminal.
  • The desktop image data of the user terminal may be sent to the application server after the user terminal receives a selection operation on desktop image data input by the user. Level 702 includes three nodes, and level 703 includes three nodes.
  • The windows corresponding to window level identifiers (1, 1) and (1, 2) are child windows of the window corresponding to window level identifier 1, so the parent node of the child nodes represented by window level identifiers (1, 1) and (1, 2) is the node represented by window level identifier 1, and the nodes represented by window level identifiers (1, 1) and (1, 2) are sibling nodes.
  • Likewise, the nodes represented by window level identifiers 1, 2, and 3 are sibling nodes of each other.
  • Converting the image polytree shown in Figure 7a into an image binary tree means: converting the first node in each of levels 701-703 of the image polytree into a left node of the image binary tree. For example, the first node in level 701 is the node represented by window level identifier 0, and this node is taken as a left node of the image binary tree; since no other nodes are included in level 701, the conversion continues with level 702.
  • In level 702, the first node is the node represented by window level identifier 1, and this node is again taken as a left node of the image binary tree.
  • Level 702 also includes two sibling nodes of the node represented by window level identifier 1, namely the node represented by window level identifier 2 and the node represented by window level identifier 3, and these two nodes are taken as right nodes of the image binary tree in turn.
  • Taking the node represented by window level identifier 2 and the node represented by window level identifier 3 as right nodes of the image binary tree in turn can be understood as: the node represented by window level identifier 2 becomes the right node of the node represented by window level identifier 1, and the node represented by window level identifier 3 becomes the right node of the node represented by window level identifier 2.
  • S704 Traverse the image binary tree according to the preset first traversal order, to obtain the synthesis order of the image data of each window.
  • the preset first (ie, binary tree image synthesis) traversal order may refer to the preorder traversal, and the preorder traversal refers to: root node-left node-right node.
  • the result of the preorder traversal in the image binary tree shown in Fig. 7b is: 0-1-(1,1)-(1,2)-2-3-(3,1).
  • That is, the image binary tree is traversed according to the preset first traversal order, and the synthesis order of the window image data is obtained as 0-1-(1,1)-(1,2)-2-3-(3,1).
  • S705 Traverse the image binary tree according to the preset second traversal order to obtain the image capturing order of each window image data, and call the data interception module to intercept and obtain multiple window image data according to the image capturing order.
  • The preset second (i.e., binary tree image acquisition) traversal order may refer to post-order traversal, and the traversal order of post-order traversal is: left node - right node - root node.
  • the result of the post-order traversal in the image binary tree shown in Figure 7b is: (1,1)-(1,2)-(3,1)-3-2-1-0.
  • That is, the image binary tree is traversed according to the preset second traversal order, and the image capturing order of the window image data is obtained as (1,1)-(1,2)-(3,1)-3-2-1-0.
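Assuming the binary tree of Figure 7b (encoded here as nested `(label, left, right)` tuples, an illustrative representation), the two traversals can be sketched as follows. The pre-order result reproduces the synthesis order given above; note that a standard post-order still visits every child before its parent and the root node last, although the exact sibling sequence it produces depends on how siblings were chained during the conversion.

```python
# Pre-order (root - left - right) yields the synthesis order;
# post-order (left - right - root) is used to derive the capture order.

def preorder(node):
    if node is None:
        return []
    label, left, right = node
    return [label] + preorder(left) + preorder(right)

def postorder(node):
    if node is None:
        return []
    label, left, right = node
    return postorder(left) + postorder(right) + [label]

# Fig. 7b after conversion: 0's left node is 1; nodes 2 and 3 chain off
# 1's right pointer; (1,1)/(1,2) hang under 1; (3,1) hangs under 3.
tree = (0,
        (1,
         ((1, 1), None, ((1, 2), None, None)),
         (2, None, (3, ((3, 1), None, None), None))),
        None)

synthesis_order = preorder(tree)   # matches the order stated in the text
capture_order = postorder(tree)    # root 0 is captured last
```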
  • the window image data may be subjected to image synthesis processing to obtain the user interface image to be displayed.
  • S706 Perform image synthesis processing on each window image data according to the synthesis sequence to obtain a user interface image to be displayed.
  • Specifically, step S706 may be: allocate a sub-canvas for each window image data, place each window image data on its corresponding sub-canvas, and combine the sub-canvases according to the above synthesis order to obtain the user interface image to be displayed.
  • Combining in the synthesis order ensures that window image data with a higher window level identifier is combined later, so that in the obtained user interface image, window image data with a higher window level identifier covers window image data with a lower window level identifier.
  • S707 Send the notification message carrying the user interface image to the user terminal corresponding to the user identification.
  • This step is the same as the above step 404, and will not be repeated here.
  • FIG. 6c is a schematic flowchart of another image acquisition method provided by an embodiment of the present application.
  • the image acquisition method may be executed by a server, and specifically may be executed by a processor of the server.
  • it can be applied to the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2.
  • As shown in Fig. 6c, on the basis of the embodiment described in Fig. 6b, the following steps may also be performed after step S702:
  • S708 Generate a message response binary tree according to the window level identifier of each window image data.
  • the message response binary tree includes multiple message nodes, and each message node records window position information and window size information corresponding to corresponding window image data.
  • the same method as step S703 can be used.
  • the message response binary tree and the image binary tree have the same structure.
  • the message binary tree generated according to the window level identifier of each window image data may also be as shown in FIG. 7b.
  • Each node of the image binary tree stores the window level identifier of a window image data. Unlike the image binary tree, each node in the message response binary tree stores not only the window level identifier of a window image data but also the window position information and window size information of the corresponding window.
  • The window position information may include the coordinates of the window, which may be expressed as pixel coordinates, such as (3 pixels, 6 pixels), or as world coordinates, such as (10, 15); the window size information may include information such as the width and height of the window.
  • S709 Determine display position information of each window image data based on the message response binary tree.
  • the display position information may include any one or two of window position information and window size information of the window image data.
  • S710 Perform synthesis processing on the image data of each window according to the display position information and the synthesis sequence, to obtain a user interface image to be displayed.
  • The embodiment of the present application provides a schematic diagram of image synthesis processing, as shown in FIG. 9. The data acquisition module is used to acquire the image data of each window in the target application process, and image synthesis processing is then performed on the window image data according to the image binary tree and the message response binary tree.
  • an implementation manner of determining the display position information of each window image data based on the message response binary tree may include:
  • Specifically, the application server can obtain the window level identifier of the first window image data from the image binary tree, then look up the window position information and window size information corresponding to that window level identifier in the message response binary tree, and determine the found window position information and window size information as the display position information of the first window image data.
  • the message response binary tree in addition to being applied to the image synthesis process, can also be used to find the target window image data corresponding to the user operation message, so that the application server can process the target window image data in response to the user operation message .
  • The user operation message indicates an operation related to the user interface image, input by the user through the user terminal.
  • Figure 6c also includes the following steps:
  • S711 Receive a user operation message sent by the user terminal, and obtain operation location information from the user operation message.
  • The operation position information included in the user operation message refers to the position at which the user's input operation occurred. For example, when the user clicks, double-clicks, or long-presses a certain position in the user interface image, the position information of that position is determined as the operation position information.
  • S712 Traverse the message response binary tree according to the preset third traversal sequence, and determine the message node whose operation location information meets the preset location condition as the target message node.
  • the target message node is the first message node found in the message response binary tree that meets a preset location condition, where the operation location information meeting the preset location condition means that the operation location information is in the Within the range defined by the window position information and window size information recorded by the target message node;
  • S713 Use window image data corresponding to the target message node as target window image data, and process the window corresponding to the target window image data in response to the user operation message.
  • Since the application server runs application processes corresponding to multiple user identifiers, the user interface images corresponding to multiple user identifiers displayed by the application server may overlap, as shown in Figure 10a, where A and B represent user interface images corresponding to different user identifiers.
  • There may also be overlaps between the window image data within the user interface image corresponding to the same user identifier, as shown in Figure 10b, where A and B represent two window image data in the user interface image corresponding to the same user identifier.
  • Different user identifiers correspond to different application processes, and the screen display parameters corresponding to different application processes differ. Because the image binary tree and the message response binary tree are generated according to the window level identifiers determined by the screen display parameters, the image binary trees and message response binary trees of different application processes also differ; it can therefore be understood that the message response binary trees corresponding to different user identifiers are different.
  • the application server needs to know which user interface image the user operation message input by the user is for.
  • the binary tree of message responses corresponding to different user identities may be used to determine which user interface image the currently received user operation message is for.
  • the application server needs to know which window image data in the user interface image the user operation message is for.
  • Specifically, the message response binary tree may be traversed according to a preset third (i.e., binary tree window search) traversal order to determine the target message node, and then the target window image data corresponding to the target message node is determined.
  • The preset third traversal order may refer to in-order traversal, which traverses the binary tree in the order: left node - root node - right node. For example, suppose the message response binary tree is as shown in Figure 7b; the in-order traversal result is: (1,1)-(1,2)-1-2-(3,1)-3-0.
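Hit testing against the message response binary tree (steps S711-S713) can be sketched as follows. The in-order traversal visits child windows before their parents, so the first node whose recorded window position and size contain the operation position is the target message node. The node encoding and the rectangles used here are illustrative assumptions, not from the patent:

```python
# Each message node is (label, rect, left, right), where rect is
# (win_x, win_y, width, height): the window position and size information.

def inorder(node):
    if node is None:
        return []
    label, rect, left, right = node
    return inorder(left) + [(label, rect)] + inorder(right)

def hit_test(tree, x, y):
    # In-order traversal visits child windows before their parents, so the
    # first node whose rectangle contains (x, y) is the target message node.
    for label, (wx, wy, w, h) in inorder(tree):
        if wx <= x < wx + w and wy <= y < wy + h:
            return label
    return None

# Parent window 1 spans (0, 0)-(100, 100); its child window (1, 1) covers
# the top-left quarter, so a click at (10, 10) resolves to the child.
tree = (1, (0, 0, 100, 100),
        ((1, 1), (0, 0, 50, 50), None, None),
        None)
target = hit_test(tree, 10, 10)   # inside the child window
outside = hit_test(tree, 80, 80)  # only inside the parent window
```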
  • an embodiment of the present application also discloses an image acquisition device, which can be applied to a server, for example, the application server 101 in FIG. 1 or the application server 201 in FIG. 2,
  • the application server is used to execute a set of application processes that provide service support for realizing the application functions of the target application, and the image acquisition device can execute the methods shown in FIGS. 4 and 6a to 6c.
  • the image acquisition device 1100 may run the following units:
  • the obtaining unit 1101 is configured to obtain the target application process corresponding to the user identifier from the application process set;
  • the interception unit 1102 is configured to, when it is detected that the image rendering function in the target application process acquired by the acquisition unit 1101 is called, call the data interception module to intercept the currently generated multiple window image data;
  • the first processing unit 1103 is configured to perform image synthesis processing on the multiple window image data intercepted by the intercepting unit 1102 to obtain a user interface image to be displayed;
  • the sending unit 1104 is configured to send a notification message carrying the user interface image obtained by the first processing unit 1103 to the user terminal corresponding to the user identification, so that the user terminal displays the user interface image on the user interface .
  • the obtaining unit 1101 is further configured to obtain the screen display parameters set for the target application process
  • the device 1100 also includes:
  • the determining unit 1105 is configured to determine the window level identifier of each window image data according to the screen display parameters acquired by the obtaining unit 1101;
  • the first processing unit 1103 is further configured to perform image synthesis processing on the multiple window image data according to the window level identifier determined by the determining unit 1105.
  • the apparatus 1100 further includes:
  • the first generating unit 1106 is configured to generate an image binary tree according to the window level identifier of each window image data determined by the determining unit 1105;
  • the first processing unit 1103 is further configured to traverse the image binary tree generated by the first generating unit 1106 according to the preset first traversal order to obtain the synthesis order of the window image data, and to perform image synthesis processing on the multiple window image data according to the synthesis order.
  • the first generating unit 1106 is configured to construct an image multi-tree according to the window level identifiers of the image data of each window determined by the determining unit 1105; and convert the image multi-tree to the image binary tree .
  • the apparatus 1100 further includes:
  • the first generating unit 1106 is configured to generate an image binary tree according to the window level identifier of each window image data determined by the determining unit 1105;
  • the intercepting unit 1102 is configured to traverse the binary tree of images generated by the first generating unit 1106 according to the preset second traversal order to obtain the image capturing order of each window image data; according to the image capturing order, call the The data interception module intercepts and obtains the multiple window image data.
  • the interception unit 1102 is configured to determine the rendering mode included in the image rendering function; according to the rendering mode, determine the data interception submodule in the data interception module corresponding to the rendering mode; call The data interception sub-module intercepts multiple window image data obtained when the image rendering function is executed.
  • the apparatus 1100 further includes:
  • the second generating unit 1107 is configured to generate a message response binary tree according to the window level identifier of each window image data determined by the determining unit 1105;
  • the first processing unit 1103 is configured to determine the display position information of each window image data based on the message response binary tree generated by the second generation unit 1107; according to the display position information and the synthesis sequence, perform the Image data is synthesized.
  • the message response binary tree includes multiple message nodes
  • the obtaining unit 1101 is further configured to receive a user operation message sent by the user terminal, and obtain operation position information from the user operation message;
  • the device 1100 also includes:
  • the second processing unit 1108 is configured to traverse the message response binary tree generated by the second generating unit 1107 according to the preset third traversal order, determine the message node for which the operation position information obtained by the obtaining unit 1101 meets the preset position condition as the target message node, take the window image data corresponding to the target message node as target window image data, and process the window corresponding to the target window image data in response to the user operation message.
  • In other embodiments, the units of the image acquisition device shown in FIG. 11 may be separately or wholly combined into one or several other units, or one of the units may be further split into multiple functionally smaller units, which can achieve the same operation without affecting the technical effects of the embodiments of the present application.
  • The above units are divided based on logical functions; in practical applications, the function of one unit may be realized by multiple units, or the functions of multiple units may be realized by one unit.
  • In other embodiments of the present application, the image acquisition device may also include other units; in practical applications, these functions may also be implemented with the assistance of other units, and may be implemented cooperatively by multiple units.
  • According to another embodiment of the present application, a general-purpose computing device, such as a computer including a central processing unit (CPU), a random access memory (RAM), a read-only memory (ROM), and other processing and storage elements, may run a computer program (including program code) capable of executing the steps involved in the corresponding method shown in FIG. 4 or FIGS. 6a-6c, so as to construct the image acquisition device shown in FIG. 11 and to implement the image acquisition method of the embodiments of the present application.
  • The computer program can be recorded on, for example, a computer-readable recording medium, loaded into the aforementioned computing device via the computer-readable recording medium, and run therein.
  • an embodiment of the present application further provides a server.
  • The server may be the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2, and is used to execute a set of application processes that provide service support for realizing the application functions of the target application.
  • the server includes at least a processor 1201, a network interface 1202, and a computer storage medium 1203.
  • the network interface 1202 is used to receive or send data when connecting to the network.
  • the computer storage medium 1203 may be stored in the memory of the server.
  • the computer storage medium 1203 is used to store a computer program; the computer program includes program instructions, and the processor 1201 is used to execute the program instructions stored in the computer storage medium 1203.
  • the processor 1201 (Central Processing Unit, CPU) is the computing core and control core of the terminal. It is suitable for implementing one or more instructions, and specifically for loading and executing one or more instructions to implement the corresponding method flows or functions; in one embodiment, the processor 1201 described in the embodiments of the present application may be used to execute the image acquisition methods described in FIG. 4 and FIGS. 6a-6c.
  • the embodiment of the present application also provides a computer storage medium (Memory).
  • the computer storage medium is a memory device in a terminal for storing programs and data. It is understandable that the computer storage medium here may include a built-in storage medium of the terminal and, of course, may also include an extended storage medium supported by the terminal.
  • the computer storage medium provides storage space, and the storage space stores the operating system of the terminal.
  • one or more instructions suitable for being loaded and executed by the processor 1201 are stored in the storage space, and these instructions may be one or more computer programs (including program codes).
  • the computer storage medium here may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory; optionally, it may also be at least one computer storage medium located far away from the aforementioned processor.
  • the processor 1201 can load and execute one or more instructions stored in the computer storage medium to implement the corresponding steps of the image acquisition method embodiments above; in a specific implementation, one or more instructions in the computer storage medium are loaded by the processor 1201 to execute the image acquisition methods described in FIG. 4 and FIGS. 6a-6c.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Generation (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Digital Computer Display Output (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The embodiments of this application disclose an image acquisition method, device, server, and storage medium. The method includes: obtaining, from a set of application processes, the target application process corresponding to a user identifier; when it is detected that an image rendering function in the target application process is called, invoking a data interception module to intercept the multiple pieces of window image data currently generated; performing image composition processing on the multiple pieces of window image data to obtain a user interface image to be displayed; and sending a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on its user interface.

Description

Image acquisition method, device, server, and storage medium
This application claims priority to Chinese patent application No. 201910468135.3, entitled "Image acquisition method, device, server and storage medium", filed with the China Patent Office on May 29, 2019.
Technical Field
This application relates to the field of Internet technologies, and in particular to an image acquisition method, device, server, and storage medium.
Background
Cloud gaming is an online gaming technology. In a cloud gaming scenario, the game runs on a cloud server; the cloud server renders the game scene into game frames and transmits them to the user terminal in real time, so that the user of the terminal can input corresponding operations on the game frames. Cloud gaming thus makes it possible to run high-quality games on user terminals whose graphics processing and computing capabilities are relatively limited.
Because of these advantages, cloud gaming has become a research hotspot in the gaming field in recent years. One of the key problems in cloud gaming research is how the cloud server obtains the frames of the running game. A common solution is to obtain frames through virtual machines: a virtual machine is deployed for each user terminal, the virtual machine renders the game scene into game frames, and the cloud server captures the virtual machine's desktop to obtain the game frames. However, this approach requires deploying a large number of virtual machines, which occupies substantial server resources and thus wastes them.
Summary
The embodiments of this application provide an image acquisition method, device, server, and storage medium. The server intercepts the window image data generated while the target application runs, and composes the window image data to obtain the user interface image to be displayed for the target application. Compared with obtaining the user interface image through a virtual machine in the prior art, this saves the resource overhead of deploying virtual machines.
In one aspect, an embodiment of this application provides an image acquisition method, applied to a server, including:
obtaining, from a set of application processes, the target application process corresponding to a user identifier;
when it is detected that an image rendering function in the target application process is called, invoking a data interception module to intercept the multiple pieces of window image data currently generated;
performing image composition processing on the multiple pieces of window image data to obtain a user interface image to be displayed; and
sending a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on its user interface.
In another aspect, an embodiment of this application provides an image acquisition device, configured in a server, including:
an obtaining unit, configured to obtain, from a set of application processes, the target application process corresponding to a user identifier;
an interception unit, configured to, when it is detected that an image rendering function in the target application process is called, invoke a data interception module to intercept the multiple pieces of window image data currently generated;
a first processing unit, configured to perform image composition processing on the multiple pieces of window image data to obtain a user interface image to be displayed; and
a sending unit, configured to send a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on its user interface.
In yet another aspect, an embodiment of this application provides a server, including:
a processor, adapted to implement one or more instructions; and
a computer storage medium storing one or more instructions, the one or more instructions being adapted to be loaded by the processor to execute the image acquisition method described above.
In yet another aspect, an embodiment of this application provides a computer storage medium storing one or more instructions, the one or more instructions being adapted to be loaded by a processor to execute the image acquisition method described above.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application or the prior art more clearly, the accompanying drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are merely some embodiments of this application, and a person skilled in the art may derive other drawings from them without creative effort.
FIG. 1 is an example architecture diagram for image acquisition;
FIG. 2 is an architecture diagram for image acquisition according to an embodiment of this application;
FIG. 3a is a desktop view of a cloud server according to an embodiment of this application;
FIG. 3b is a schematic diagram of a user interface image according to an embodiment of this application;
FIG. 4 is a schematic flowchart of an image acquisition method according to an embodiment of this application;
FIG. 5a is a schematic diagram of obtaining window image data according to an embodiment of this application;
FIG. 5b is a schematic diagram of window image data according to an embodiment of this application;
FIG. 5c is a schematic diagram of other window image data according to an embodiment of this application;
FIG. 5d is a schematic diagram of still other window image data according to an embodiment of this application;
FIG. 6a is a schematic flowchart of another image acquisition method according to an embodiment of this application;
FIG. 6b is a schematic flowchart of yet another image acquisition method according to an embodiment of this application;
FIG. 6c is a schematic flowchart of another image acquisition method according to an embodiment of this application;
FIG. 7a is a schematic diagram of an image multiway tree according to an embodiment of this application;
FIG. 7b is a schematic diagram of an image binary tree according to an embodiment of this application;
FIG. 8 is a schematic diagram of a data interception module according to an embodiment of this application;
FIG. 9 is a schematic diagram of image composition processing according to an embodiment of this application;
FIG. 10a is a desktop view of an application server according to an embodiment of this application;
FIG. 10b is a schematic diagram of a user interface image according to an embodiment of this application;
FIG. 11 is a schematic structural diagram of an image acquisition device according to an embodiment of this application;
FIG. 12 is a schematic structural diagram of a server according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings.
In an implementation in which a user terminal obtains images from a server, an image acquisition architecture may be as shown in FIG. 1. In FIG. 1, 101 denotes an application server that can run multiple application processes; 102 denotes the target application process, among the multiple application processes, corresponding to a user identifier; 103 denotes the user terminal logged in to the target application with that user identifier; and 104 denotes the virtual machine corresponding to the user terminal 103. It should be understood that, to ensure that the target application on the user terminal runs normally, the application server must provide service support for its operation; in other words, the application server executes a set of application processes that provide service support for realizing the application functions of the target application. The target application may be any application, such as a game application, an instant messaging application, or a shopping application.
The application server provides service support for the target application through service programs related to the target application. Specifically, the application server supports the target application's functions by executing a service program; in the computing field, the activity of the application server running a service program is called an application process. If the target application runs simultaneously on multiple user terminals, then to ensure that the instances on the different terminals do not interfere with one another, the application server creates one application process per user to support that user's instance of the target application. That is, the application server may execute a set of application processes providing service support for realizing the application functions of the target application; the set includes multiple application processes, each corresponding to one user identifier. The target application process described in the embodiments of this application may be the application process corresponding to any one user identifier.
For example, in a cloud gaming scenario, assume the target application is a game application. If user A and user B launch the game application simultaneously from different user terminals, the application server, after detecting their operation instructions for the game application, creates application process A for user A and application process B for user B, so that the players on the two terminals can experience the game application.
In the image acquisition architecture shown in FIG. 1, when the application server 101 receives an operation instruction for the target application from the user terminal 103, it notifies the virtual machine 104 corresponding to the user identifier of the terminal 103 to run the target application process 102, while the other virtual machines run application process 1 and application process 2 respectively. The application server 101 obtains the user interface image corresponding to the user identifier by running the target application on the virtual machine 104. Optionally, this may work as follows: the virtual machine 104 calls the image rendering function of the target application process 102 to obtain rendered window image data; the virtual machine 104 then composes the window image data with other parameters to obtain the image to be displayed, and displays it in the virtual machine's user interface; the application server 101 captures the current user interface of the virtual machine 104, takes the captured image as the user interface image corresponding to the user identifier, and sends it to the user terminal 103. In this way, the user identified by the user identifier can view, through the user terminal 103, the user interface image corresponding to the target application process 102.
In one embodiment, the image acquisition architecture may instead be as shown in FIG. 2. In FIG. 2, 201 denotes an application server running a set of application processes that provide service support for the application functions of the target application; 202 denotes the target application process, in that set, corresponding to a user identifier; and 203 denotes the user terminal logged in to the target application with that user identifier, i.e., the user terminal corresponding to the user identifier. In the embodiments of this application, each application process in the set is also configured with a data interception module: for example, a data interception module 2021 is configured in the target application process, and data interception modules are likewise configured in application process 1 and application process 2.
In one embodiment, while the target application process 202 runs, the application server 201 may call the data interception module 2021 in the target application process 202 to intercept window image data, which is obtained when the image rendering function in the target application process 202 is executed for rendering; image composition processing is then performed on the window image data to obtain the user interface image to be displayed. Optionally, the application server 201 may be configured with a frame compositor 204: the application server 201 sends the window image data intercepted by the data interception module to the frame compositor 204, and calls the frame compositor 204 to compose the window image data into the user interface image to be displayed. After obtaining the user interface image, the application server 201 sends a notification message carrying it to the user terminal 203, so that the user terminal 203 displays the image on its user interface.
Comparing the image acquisition architectures of FIG. 1 and FIG. 2 shows that FIG. 1 requires deploying a virtual machine for each user identifier so that the application process of each identifier runs in its own virtual machine; deploying many virtual machines consumes considerable resources. In addition, because the graphics card in a virtual machine does not support multimedia well, deploying a virtual machine requires separately configuring a high-performance graphics card in the application server 101 for the virtual machine's use, as shown at 1011 in FIG. 1. However, since the high-performance graphics card 1011 is used only to support multimedia services in the virtual machine and is otherwise idle, its utilization is low and cost is wasted.
Moreover, to enable communication among the application server 101, the virtual machine 104, and the target application process 102 in FIG. 1, a corresponding service module must be deployed in the virtual machine 104, as shown at 1041 in FIG. 1. Normally, user permissions control the user's operations on the virtual machine through the service module 1041; some malicious users may break through the permission controls and use the service module 1041 to steal information or cause damage, making image acquisition less secure.
Furthermore, in the architecture of FIG. 1, each application process runs in its own virtual machine rather than on the application server, which lowers the running efficiency of the application processes.
Compared with FIG. 1, the image acquisition architecture of FIG. 2 saves resource consumption by not deploying virtual machines; the application server 201 presents to the user terminal 203 only the user interface image related to the target application process, which improves application security; and the application processes run directly on the application server 201 rather than in virtual machines, which improves their running efficiency. In summary, the embodiments of this application based on the image acquisition architecture of FIG. 2 are described in detail below.
Based on the image acquisition architecture of FIG. 2, an embodiment of this application provides an image acquisition method that can be applied in an application server used to execute a set of application processes providing service support for realizing the application functions of the target application, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2. With the image acquisition method of the embodiments of this application, the user interface image corresponding to a certain user identifier can be obtained and carried in a notification message sent to the user terminal corresponding to that identifier, instructing the terminal to display the user interface image on its user interface. The user identified by the user identifier can then operate the target application through the user interface image. Compared with obtaining the user interface image to be displayed through a virtual machine in the prior art, the embodiments of this application save the resource overhead of deploying virtual machines in the application server.
In one embodiment, the image acquisition method of the embodiments of this application may be applied in cloud gaming scenarios. In cloud gaming, the games corresponding to multiple user identifiers run on a cloud server; the cloud server renders the game scenes of the games into game animation and transmits it in real time to the user terminal corresponding to each user identifier, so that each identified user can input corresponding operations on the game frames.
When the embodiments of this application are applied to cloud gaming, the application server corresponds to the cloud server and the user interface image corresponds to the game frame. For example, FIG. 3a is a desktop view of a cloud server provided by an embodiment of this application, the cloud server being, for example, the application server 101 in FIG. 1 or the application server 201 in FIG. 2; in FIG. 3a, 300 denotes the cloud server's desktop. In FIG. 3a, the cloud server simultaneously runs multiple application processes supporting games. Using the image acquisition method of the embodiments of this application, the cloud server can obtain the game frames corresponding to the multiple application processes, and its desktop view can simultaneously show the game frame of each application process, for example 301, 302, and 303. Since each application process corresponds to one user identifier, the game frame of each application process is the game frame corresponding to that user identifier: for example, 301 denotes the game frame corresponding to user identifier A, 302 the game frame corresponding to user identifier B, and 303 the game frame corresponding to user identifier C.
With the image acquisition method of the embodiments of this application, the game frame corresponding to each user identifier can be transmitted to the corresponding user terminal for display. For the user identified by a given user identifier, the game frame seen in the terminal's user interface is the cloud server's game frame for that identifier; game frames of other identifiers are not shown. In other words, the games of the different users do not affect one another. For example, FIG. 3b shows the user interface image of the user terminal corresponding to user identifier A among the multiple identifiers, the user terminal being, for example, the user terminal 103 in FIG. 1 or the user terminal 203 in FIG. 2. As can be seen, the user interface of the terminal corresponding to user identifier A shows only the game frame 301 corresponding to user identifier A.
Based on the above description, an embodiment of this application provides a schematic flowchart of an image acquisition method, as shown in FIG. 4. The image acquisition method of FIG. 4 may be executed by a server, specifically by the server's processor, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2, which executes a set of application processes providing service support for realizing the application functions of the target application. The image acquisition method of FIG. 4 may include the following steps:
S401: Obtain, from the set of application processes, the target application process corresponding to a user identifier.
The user identifier represents the identity of a user and may include any one or more of the user's ID card, the user's phone number, and the user's login account.
In one embodiment, obtaining the target application process corresponding to the user identifier may be performed by the application server after receiving a user operation request sent by the user terminal. The user terminal is a terminal device logged in with the user identifier; it may be any device with video decompression capability, such as a mobile phone, tablet, or desktop computer. The user operation request may include any one or more of a login operation, a window switching operation, and a logout operation for the target application.
For example, if the user indicated by the user identifier logs in to the target application on the user terminal for the first time, the user operation request received by the terminal may be a login operation; if the terminal is already logged in to the target application with the user identifier, the user operation message may be a logout operation, a window switching operation among the windows of the target application, a request to run some application function of the target application, and so on.
S402: When it is detected that an image rendering function in the target application process is called, invoke the data interception module to intercept the multiple pieces of window image data currently generated.
In this step, the window image data is obtained when the image rendering function in the target application process is executed while the target application process runs. Optionally, an implementation of invoking the data interception module to intercept the currently generated window image data is shown in FIG. 5a. FIG. 5a is a schematic diagram of obtaining window image data provided by an embodiment of this application; the method may be executed by a server, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2, and includes the following steps:
S4021: Execute the image rendering function in the target application process.
S4022: When it is detected that the image rendering function in the target application process is called, intercept the image rendering function.
S4023: Invoke the data interception module to obtain the window image data generated by the image rendering function for each window.
S4024: The image rendering function finishes execution.
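The flow of steps S4021-S4024 above can be sketched as wrapping (hooking) a render function so that its per-window output is captured before the function returns normally. The sketch below is a minimal Python illustration; the names `render_window` and `DataInterceptor` are hypothetical and do not appear in this application.

```python
# Minimal sketch of S4021-S4024: hook a rendering function so that the
# window image data it produces is intercepted before it returns.
# All names here (render_window, DataInterceptor) are illustrative only.

class DataInterceptor:
    """Collects window image data produced by the hooked render function."""
    def __init__(self):
        self.captured = []

    def capture(self, window_id, image_data):
        self.captured.append((window_id, image_data))

def hook_render(render_fn, interceptor):
    """Return a wrapper that intercepts render_fn's output (S4022-S4023)."""
    def wrapper(window_id, scene):
        image_data = render_fn(window_id, scene)    # original rendering (S4021)
        interceptor.capture(window_id, image_data)  # interception (S4023)
        return image_data                           # normal return (S4024)
    return wrapper

# Hypothetical original render function of the target application process.
def render_window(window_id, scene):
    return f"pixels({window_id}:{scene})"

interceptor = DataInterceptor()
render_window = hook_render(render_window, interceptor)

render_window("home", "scene-a")
render_window("store", "scene-b")
print(interceptor.captured)
```

Because the wrapper still returns the rendered data, the hooked application keeps running unchanged while the interceptor accumulates each window's output.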
Image rendering refers to the process of converting three-dimensional light-energy transfer into a two-dimensional image; an image rendering function is a function that implements this rendering functionality.
The rendering methods commonly used in image rendering functions include three kinds: the Open Graphics Library (OpenGL), the Graphics Device Interface (GDI), and the DirectX programming interface. OpenGL is a low-level 3D graphics library that provides only rendering functionality and can be ported across platforms; DirectX is designed mainly for GPUs, requires hardware support, and is better suited to rendering images for game applications; GDI is designed for general purposes and can also be used for rendering game applications, although its effect is inferior to DirectX.
In one embodiment, when different windows are rendered, the rendering methods in the image rendering functions of the target application process may differ per window. For example, when rendering the first window of the target application, the rendering method included in the image rendering function may be GDI; when rendering the second window, the rendering method included in the image rendering function may be OpenGL.
In one embodiment, FIGS. 5b-5d are schematic diagrams of window image data provided by an embodiment of this application. Assume the target application is a game application with three windows, namely a "Home" window, a "Store" window, and a "Video" window, each corresponding to one piece of window image data: the "Home" window image data is shown in FIG. 5b, the "Store" window image data in FIG. 5c, and the "Video" window image data in FIG. 5d.
S403: Perform image composition processing on the multiple pieces of window image data to obtain the user interface image to be displayed.
In one embodiment, image composition processing may mean stacking the pieces of window image data in sequence according to their composition order. The composition order is the order in which the pieces of window image data are processed during composition; the earlier a piece appears in the composition order, the earlier it is processed. The composition order may be determined from the window level identifier of each piece of window image data. The window level identifier indicates the level of a window: the higher the level it indicates, the later that piece of window image data appears in the composition order.
For example, assume that in FIGS. 5b-5d the window level identifiers of the three pieces of window image data, from high to low, are "Video", "Store", "Home". Since window image data with a higher level identifier is composed later, the composition order of the three pieces in FIGS. 5b-5d may be "Home", "Store", "Video"; composing the three pieces of window data in this order yields the user interface image to be displayed, which may be as shown in FIG. 3b. Optionally, the application server may call a frame compositor to compose the pieces of window image data.
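As a rough illustration of the composition described in S403, the sketch below stacks window layers onto a canvas in composition order, with later (higher-level) layers overwriting earlier ones where they overlap. The character-grid canvas and the specific rectangles are simplified assumptions for illustration, not the compositor of this application.

```python
# Sketch of S403: compose window image data in composition order.
# A "window" here is a rectangle of single-character pixels; later windows
# in the composition order overwrite earlier ones where they overlap.

def compose(canvas_w, canvas_h, windows):
    """windows: list of (x, y, w, h, pixel) in composition order."""
    canvas = [["." for _ in range(canvas_w)] for _ in range(canvas_h)]
    for x, y, w, h, pixel in windows:            # earlier entries drawn first
        for row in range(y, min(y + h, canvas_h)):
            for col in range(x, min(x + w, canvas_w)):
                canvas[row][col] = pixel         # later windows cover earlier
    return ["".join(row) for row in canvas]

# Composition order "Home" -> "Store" -> "Video" (lowest to highest level).
frame = compose(8, 3, [
    (0, 0, 8, 3, "H"),   # "Home": full-screen background
    (2, 1, 4, 2, "S"),   # "Store": partially covers "Home"
    (4, 0, 3, 2, "V"),   # "Video": highest level, drawn last
])
print("\n".join(frame))
```

Drawing in composition order is what makes the highest-level window ("Video" here) end up on top in the final user interface image.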
S404: Send a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on its user interface.
The notification message instructs the user terminal to display the user interface image on its user interface; a schematic diagram of the user terminal displaying the user interface image may be as shown in FIG. 3b.
Compared with obtaining the user interface image by deploying virtual machines in the prior art, in the embodiments of this application the application server intercepts the window image data obtained by calling the rendering function while the target application executes and then performs image composition processing on the window image data to obtain the user interface image, which saves the application server's resource overhead.
Refer to FIG. 6a, a schematic flowchart of another image acquisition method provided by an embodiment of this application. The image acquisition method of FIG. 6a may be executed by a server, specifically by the server's processor, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2, which executes a set of application processes providing service support for realizing the application functions of the target application. The image acquisition method of FIG. 6a may include the following steps:
S601: Obtain, from the set of application processes, the target application process corresponding to a user identifier.
This step is the same as step S401 above and is not repeated here.
S602: When it is detected that an image rendering function in the target application process is called, determine the rendering method included in the image rendering function.
There are multiple pieces of window image data; each piece is obtained by executing one or more image rendering functions of the target application, and the rendering method included in each image rendering function may be any one of OpenGL, GDI, and DirectX.
S603: According to the rendering method, determine the data interception submodule, in the data interception module, corresponding to the rendering method.
The data interception module may include three interception submodules corresponding respectively to the three rendering methods above; specifically, the three interception submodules included in the data interception module may be an OpenGL interception submodule, a GDI interception submodule, and a DirectX interception submodule. Optionally, the correspondence between the interception submodules and the rendering methods may be: the OpenGL interception submodule corresponds to OpenGL, the GDI interception submodule corresponds to GDI, and the DirectX interception submodule corresponds to DirectX. It should be understood that invoking the interception submodule corresponding to the rendering method to obtain the window image data ensures the accuracy of the intercepted window image data.
In one embodiment, a schematic diagram of the data interception module provided by an embodiment of this application may be as shown in FIG. 8; the data interception module of FIG. 8 may include a GDI interception submodule, a DirectX interception submodule, and an OpenGL interception submodule.
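Steps S602-S603 amount to a dispatch from a detected rendering method to the matching interception submodule, which can be sketched as a simple lookup. The submodule classes below are placeholders standing in for real GDI/DirectX/OpenGL hooks, not the submodules of this application.

```python
# Sketch of S602-S603: pick the interception submodule that matches the
# rendering method found in the image rendering function. The submodules
# below are placeholders for real GDI/DirectX/OpenGL interception hooks.

class GdiInterceptor:
    name = "GDI"

class DirectXInterceptor:
    name = "DirectX"

class OpenGLInterceptor:
    name = "OpenGL"

# Data interception module: one submodule per supported rendering method.
SUBMODULES = {
    "GDI": GdiInterceptor(),
    "DirectX": DirectXInterceptor(),
    "OpenGL": OpenGLInterceptor(),
}

def select_submodule(rendering_method):
    """S603: map the detected rendering method to its submodule."""
    try:
        return SUBMODULES[rendering_method]
    except KeyError:
        raise ValueError(f"unsupported rendering method: {rendering_method}")

print(select_submodule("OpenGL").name)
```

Matching the submodule to the rendering method is what the text means by ensuring the accuracy of the intercepted data: a GDI hook cannot capture frames produced through OpenGL, and vice versa.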
S604: Invoke the data interception submodule to intercept the multiple pieces of window image data obtained when the image rendering function executes.
While performing steps S602-S604, the following step S605 may also be performed:
S605: Obtain the frame display parameters set for the target application process, and determine the window level identifier of each piece of window image data according to the frame display parameters.
The frame display parameters may include any one or more of whether the window corresponding to each piece of window image data is minimized or maximized, and the arrangement relationships among the windows; for example, the arrangement relationships may be that window A is placed in front of window B, window C is placed behind window B, and so on.
In one embodiment, the frame display parameters may be stored in a sandbox. In the embodiments of this application, the sandbox is a mechanism that isolates the multiple application processes run by the application server from one another.
In one embodiment, the frame display parameters stored in the sandbox for the target application process may be parameters set by default by the application server. For example, assuming the target application is a game application, the parameters the application server sets for the target application process supporting the game's functions may include: the window corresponding to the "Home" window data is displayed frontmost, the window corresponding to the "Store" window image data is placed behind the window of the "Home" window image data, and the window corresponding to the "Video" window image data is placed behind the window of the "Store" image data. The frame display parameters are then: the "Home" window image data is displayed frontmost, the "Store" window image data is placed behind the "Home" window image data, and the "Video" window image data is placed behind the "Store" window image data.
In other embodiments, the frame display parameters stored in the sandbox for the target application process may be determined from a user operation request sent by the user terminal together with the parameters set by default by the application server. For example, assume the target application is a game, and the default parameters set for the target application process supporting the game's functions include: the window of the "Home" window image data is displayed frontmost, the window of the "Store" window image data is placed behind it, and the window of the "Video" window image data is placed behind the "Store" window. If a user operation request is detected that switches from the window of the "Home" window image data to the window of the "Video" window image data, the frame display parameters determined from the user operation request and the default parameters may include: the window of the "Video" window image data is displayed frontmost, the window of the "Store" window image data is placed behind the window of the "Home" window image data, and the window of the "Home" window image data is placed behind the window of the "Video" window image data.
After the frame display parameters set for the target application process are obtained, the window level identifier of each piece of window image data can be determined from them. The window level identifier uniquely represents the corresponding piece of window image data and indicates the level relationships among the pieces of window image data.
The window level identifier may be represented by one or more numbers, such as 1, 2, 3, or (1,2), (2,1), (3,1); it may also be represented in other forms. In the embodiments of this application, when window level identifiers are represented by numbers: if the identifiers contain different counts of numbers, the identifier containing more numbers is higher; for example, the identifier (1,2) is higher than the identifier 1. If the identifiers all contain N numbers, N being any positive integer greater than 1, with the first N-1 numbers identical and the N-th number different, then the identifier whose numbers sum to less is higher; for example, the identifier (1,2) is higher than the identifier (1,5). The window with level identifier (1,n), n being any positive integer greater than or equal to 1, is a child window of the window with level identifier (1); the window with level identifier (1,n,m) is a child window of the window with level identifier (1,n). It follows from the above that a child window's level identifier is higher than its parent window's level identifier.
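The comparison rules just stated can be encoded directly. The sketch below treats an identifier as a tuple of numbers and reports which of two identifiers is higher under those rules (more numbers wins; with equal length and a common prefix, the smaller sum wins). It is only an illustration of the stated rules, not part of this application.

```python
# Sketch of the window-level-identifier comparison rules: an identifier is
# a tuple of numbers, e.g. (1,) or (1, 2). More numbers => higher level;
# same length with an identical prefix => the smaller sum is higher.

def is_higher(a, b):
    """Return True if identifier a denotes a higher window level than b."""
    if len(a) != len(b):
        return len(a) > len(b)          # e.g. (1, 2) is higher than (1,)
    if a[:-1] == b[:-1]:
        return sum(a) < sum(b)          # e.g. (1, 2) is higher than (1, 5)
    raise ValueError("rule defined only for siblings or different lengths")

print(is_higher((1, 2), (1,)))    # child window above its parent
print(is_higher((1, 2), (1, 5)))  # sibling with the smaller sum is higher
```

Note that the length rule reproduces the statement that a child window's identifier, such as (1,2), is always higher than its parent's identifier (1).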
S606: Perform image composition processing on the multiple pieces of window image data according to the window level identifiers to obtain the user interface image to be displayed.
Since the window level identifiers represent the level relationships among the windows, image composition processing is performed on the multiple pieces of window image data in order of level relationship, from high to low.
S607: Send a notification message carrying the user interface image to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on its user interface.
This step is the same as step S404 above and is not repeated here.
Refer to FIG. 6b, a schematic flowchart of yet another image acquisition method provided by an embodiment of this application. The image acquisition method of FIG. 6b may be executed by a server, specifically by the server's processor, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2, which executes a set of application processes providing service support for realizing the application functions of the target application. The image acquisition method of FIG. 6b may include the following steps:
S701: Obtain, from the set of application processes, the target application process corresponding to a user identifier.
This step is the same as step S401 above and is not repeated here.
S702: Obtain the frame display parameters set for the target application process, and determine the window level identifier of each piece of window image data according to the frame display parameters.
This step is the same as step S605 above and is not repeated here.
S703: Generate an image binary tree according to the window level identifiers of the pieces of window image data.
For convenience, in the embodiments of this application an image binary tree is generated from the window level identifiers of the pieces of window image data. The image binary tree may include multiple binary tree nodes, each storing a window level identifier.
In one embodiment, generating the image binary tree according to the window level identifiers of the pieces of window image data may include:
S7031: Construct an image multiway tree according to the window level identifiers of the pieces of window image data.
The image multiway tree includes N levels, each level including M nodes, where N and M are positive integers greater than or equal to 1. Two adjacent levels of the image multiway tree include parent nodes and child nodes in a parent-child relationship, and nodes in the same level of the multiway tree that share the same parent node are siblings of one another.
S7032: Convert the image multiway tree into the image binary tree.
Specifically, the first node in each level of the image multiway tree is converted into a left node of the image binary tree, and the other nodes that are siblings of that first node are converted into right nodes of the image binary tree.
For example, assume the image multiway tree constructed from the window level identifiers of the pieces of window image data is as shown in FIG. 7a, and assume the tree includes three levels, shown as 701-703; the number in each node of a level denotes the window level identifier of the window image data corresponding to that node. As can be seen, level 701 contains one node, which may be called the root node; in the embodiments of this application the root node is the desktop image data of the user terminal. The desktop image data of the user terminal may be sent by the user terminal after it receives the user's selection operation on the desktop image data. Level 702 contains three nodes, and level 703 contains three nodes.
Among them, the windows corresponding to level identifiers (1,1) and (1,2) in level 702 are child windows of the window corresponding to level identifier 1; therefore the parent node of the child nodes represented by identifiers (1,1) and (1,2) is the node represented by identifier 1, and the nodes represented by (1,1) and (1,2) are siblings of each other. Similarly, the nodes represented by identifiers 1, 2, and 3 are siblings of one another.
Converting the image multiway tree of FIG. 7a into an image binary tree means converting the first node of each of the levels 701-703 into a left node of the binary tree. For example, the first node in level 701 is the node with level identifier 0, which becomes a left node of the binary tree; level 701 contains no other nodes, so conversion continues with level 702. In level 702 the first node is the node with identifier 1, which again becomes a left node of the binary tree; level 702 also contains the two sibling nodes of the node with identifier 1, namely the nodes with identifiers 2 and 3, which become right nodes of the binary tree in turn. Traversing all nodes in this way yields the image binary tree shown in FIG. 7b once all nodes have been processed.
It should be understood that making the nodes represented by identifiers 2 and 3 right nodes of the binary tree in turn means: the node represented by identifier 2 becomes the right node of the node represented by identifier 1, and the node represented by identifier 3 becomes the right node of the node represented by identifier 2.
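The conversion in S7031-S7032 is the classic left-child/right-sibling encoding of a multiway tree: a node's first child becomes its left node, and each remaining sibling becomes the right node of the sibling before it. The sketch below builds a multiway tree inferred from the FIG. 7a description (root 0 with children 1, 2, 3; window 1 with child windows (1,1) and (1,2); window 3 with child window (3,1)), converts it, and checks the pre-order result against the composition order stated later; the exact node layout is an assumed reconstruction.

```python
# Sketch of S7031-S7032: convert an image multiway tree into an image
# binary tree using the left-child / right-sibling encoding.

class Node:
    def __init__(self, level_id):
        self.level_id = level_id   # window level identifier
        self.children = []         # multiway-tree children
        self.left = None           # binary-tree left (first child)
        self.right = None          # binary-tree right (next sibling)

def to_binary(node):
    """Rewire node's subtree into left-child/right-sibling form."""
    prev = None
    for child in node.children:
        if prev is None:
            node.left = child      # first child -> left node
        else:
            prev.right = child     # later sibling -> right node of previous
        to_binary(child)
        prev = child
    return node

def preorder(node):
    """First traversal order (root-left-right): the composition order."""
    if node is None:
        return []
    return [node.level_id] + preorder(node.left) + preorder(node.right)

# Multiway tree inferred from FIG. 7a: desktop 0 with children 1, 2, 3;
# window 1 has child windows (1,1) and (1,2); window 3 has child (3,1).
root = Node("0")
w1, w2, w3 = Node("1"), Node("2"), Node("3")
w1.children = [Node("(1,1)"), Node("(1,2)")]
w3.children = [Node("(3,1)")]
root.children = [w1, w2, w3]

print(preorder(to_binary(root)))
```

Pre-order traversal of the converted tree reproduces the composition order 0-1-(1,1)-(1,2)-2-3-(3,1) given for FIG. 7b in step S704 below.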
S704: Traverse the image binary tree in the preset first traversal order to obtain the composition order of the pieces of window image data.
The preset first traversal order (i.e., the binary-tree image-composition traversal order) may be pre-order traversal, which visits root node, then left node, then right node. For example, pre-order traversal of the image binary tree of FIG. 7b yields: 0-1-(1,1)-(1,2)-2-3-(3,1). Traversing the image binary tree in this preset display traversal order yields the composition order of the pieces of window image data: 0-1-(1,1)-(1,2)-2-3-(3,1).
S705: Traverse the image binary tree in the preset second traversal order to obtain the image capture order of the pieces of window image data, and invoke the data interception module in that image capture order to intercept the multiple pieces of window image data.
In one embodiment, the preset second traversal order (the binary-tree image-capture traversal order) may be post-order traversal, whose order is: left node, right node, root node. For example, post-order traversal of the image binary tree of FIG. 7b yields: (1,1)-(1,2)-(3,1)-3-2-1-0. Traversing the image binary tree in this preset image-capture traversal order yields the image capture order of the pieces of window image data: (1,1)-(1,2)-(3,1)-3-2-1-0.
In one embodiment, after the window image data is obtained, image composition processing may be performed on it to obtain the user interface image to be displayed.
S706: Perform image composition processing on the pieces of window image data in the composition order to obtain the user interface image to be displayed.
In one embodiment, step S706 may be implemented as follows: allocate a sub-canvas for each piece of window image data, place the corresponding window image data on its sub-canvas, and combine the sub-canvases in the composition order above to obtain the user interface image to be displayed.
It should be understood that combining in the composition order above ensures that window image data with a higher window level identifier is combined later; as a result, in the obtained user interface image, window image data with a higher level identifier covers window image data with a lower one.
S707: Send a notification message carrying the user interface image to the user terminal corresponding to the user identifier.
This step is the same as step S404 above and is not repeated here.
In another embodiment of this application, FIG. 6c is a schematic flowchart of another image acquisition method provided by an embodiment of this application; the method may be executed by a server, specifically by the server's processor, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2. As shown in FIG. 6c, on the basis of the embodiment of FIG. 6b, the following may also be performed after step S702:
S708: Generate a message response binary tree according to the window level identifiers of the pieces of window image data.
The message response binary tree includes multiple message nodes, each recording the window position information and window size information corresponding to a piece of window image data. To simplify operations, when generating the message response binary tree from the window level identifiers, the same method as in step S703 may be used, so that the message response binary tree and the image binary tree have the same structure. For example, the message binary tree generated from the window level identifiers of the pieces of window image data may also be as shown in FIG. 7b.
Each node of the image binary tree stores the window level identifier of a piece of window image data. Unlike the image binary tree, each node of the message response binary tree stores not only the window level identifier but also the window position information and window size information corresponding to the piece of window image data. The window position information may include the window's coordinates, which may be expressed in pixel coordinates, such as (3 px, 6 px), or in world coordinates, such as (10, 15); the window size information may include the window's width, height, and similar information.
S709: Determine the display position information of each piece of window image data based on the message response binary tree.
The display position information may include either or both of the window position information and window size information of the window image data.
S710: Compose the pieces of window image data according to the display position information and the composition order to obtain the user interface image to be displayed.
When composing the pieces of window image data, in addition to knowing the pieces themselves, it is also necessary to know where each piece should be placed. For example, as shown in FIGS. 5b-5d, after the "Home" window image data and the "Store" window image data are known, it is still necessary to know at what size and at what position the two pieces of window image data should be combined in order to obtain the user interface image of FIG. 3b. Therefore, when composing the pieces of window image data, their display position information can be obtained through the message response binary tree.
Based on the above description, an embodiment of this application provides the schematic diagram of image composition processing shown in FIG. 9. In FIG. 9, the data acquisition module obtains the pieces of window image data in the target application process, and the application server calls the frame compositor to perform image composition processing on the pieces of window image data according to the image binary tree and the message response binary tree.
In one embodiment, in step S709, determining the display position information of each piece of window image data based on the message response binary tree may be implemented as follows:
For a first piece of window image data among the pieces (the first piece being any one of them), when it is detected that the first piece is currently undergoing image composition processing, the application server may obtain the window level identifier of the first piece from the image binary tree, then look up, in the message response binary tree, the window position information and window size information corresponding to that identifier, and determine the found window position information and window size information as the display position information of the first piece of window image data.
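The lookup in S709 can be sketched as building an index from window level identifier to the recorded position and size, then querying it during composition. A flat dictionary index is used here for brevity, and the node class and the example values are illustrative assumptions; a real implementation could equally walk the tree nodes directly.

```python
# Sketch of S709: look up a window's position and size in the message
# response binary tree by its window level identifier.

class MsgNode:
    def __init__(self, level_id, position, size, left=None, right=None):
        self.level_id = level_id
        self.position = position   # (x, y) of the window
        self.size = size           # (width, height) of the window
        self.left = left
        self.right = right

def index_by_level_id(node, index=None):
    """Build {level_id: (position, size)} from a message response tree."""
    if index is None:
        index = {}
    if node is not None:
        index[node.level_id] = (node.position, node.size)
        index_by_level_id(node.left, index)
        index_by_level_id(node.right, index)
    return index

def display_position(index, level_id):
    """S709: the display position information of one piece of data."""
    return index[level_id]

# Small example tree with assumed positions and sizes.
tree = MsgNode("1", (0, 0), (60, 60),
               left=MsgNode("(1,1)", (5, 5), (20, 20)))
index = index_by_level_id(tree)
print(display_position(index, "(1,1)"))
```

During composition, the compositor would query this index with the level identifier read from the image binary tree node currently being processed.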
In other embodiments, in addition to being used in image composition processing, the message response binary tree may also be used to find the target window image data corresponding to a user operation message, so that the application server can process the target window image data in response to the user operation message. The user operation message is an operation related to the user interface image that the user inputs through the user terminal. Specifically, FIG. 6c also includes the following steps:
S711: Receive a user operation message sent by the user terminal, and obtain operation position information from the user operation message.
The operation position information included in the user operation message is the position information of the location of the user's input operation; for example, if the user clicks, double-clicks, or long-presses a certain location in the user interface image, the position information of that location is determined as the operation position information.
S712: Traverse the message response binary tree in the preset third traversal order, and determine the message node for which the operation position information satisfies the preset position condition as the target message node.
Specifically, the target message node is the first message node found in the message response binary tree that satisfies the preset position condition; the operation position information satisfying the preset position condition means that the operation position information falls within the range defined by the window position information and window size information recorded in the target message node.
S713: Take the window image data corresponding to the target message node as the target window image data, and process the window corresponding to the target window image data in response to the user operation message.
Because the application server runs the application processes corresponding to multiple user identifiers, it displays the user interface images corresponding to multiple user identifiers, and the user interface images of different identifiers may overlap, as shown in FIG. 10a, where A and B denote user interface images corresponding to different user identifiers. The pieces of window image data within the user interface image of one user identifier may also overlap, as shown in FIG. 10b, where A and B denote two pieces of window image data within the user interface image of the same user identifier.
Different user identifiers correspond to different application processes, and different application processes have different frame display parameters. Since the image binary tree and the message binary tree are generated from the window level identifiers determined by the frame display parameters, the image binary tree and message response binary tree differ between application processes; it can therefore be understood that different user identifiers correspond to different message response binary trees.
For the case of FIG. 10a, if the position indicated by the operation position information in the user operation message lies in the overlapping region of A and B, the application server needs to know which user interface image the user's operation message targets. Optionally, the message response binary trees corresponding to the different user identifiers can be used to determine which user interface image the currently received user operation message targets.
For the case of FIG. 10b, the application server needs to know which piece of window image data within the user interface image the user operation message targets. Optionally, the message response binary tree can be traversed in the preset third (i.e., binary-tree window lookup) traversal order to determine the target message node and, from it, the target window image data corresponding to the target node.
The preset third traversal order may be in-order traversal, which traverses the binary tree in the order: left node, root node, right node. For example, assuming the message response binary tree is as shown in FIG. 7b, the in-order traversal result is: (1,1)-(1,2)-1-2-(3,1)-3-0.
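The hit test of S711-S713 can be sketched as an in-order (left-root-right) walk that returns the first node whose recorded window rectangle contains the operation position. Because child windows appear before their parents in the in-order sequence, the topmost (highest-level) window under the click wins. The tree shape below follows FIG. 7b, while the (x, y, width, height) rectangles are made-up illustration values.

```python
# Sketch of S711-S713: traverse the message response binary tree in-order
# and return the first node whose recorded window rectangle contains the
# operation position.

class MsgNode:
    def __init__(self, level_id, rect, left=None, right=None):
        self.level_id = level_id
        self.rect = rect          # (x, y, width, height) of the window
        self.left = left
        self.right = right

def contains(rect, pos):
    x, y, w, h = rect
    px, py = pos
    return x <= px < x + w and y <= py < y + h

def hit_test(node, pos):
    """In-order lookup: the first node whose rectangle contains pos wins."""
    if node is None:
        return None
    found = hit_test(node.left, pos)   # child windows checked first
    if found is not None:
        return found
    if contains(node.rect, pos):
        return node
    return hit_test(node.right, pos)

# Message response binary tree shaped like FIG. 7b, with assumed rects.
tree = MsgNode("0", (0, 0, 100, 100),
               left=MsgNode("1", (0, 0, 60, 60),
                            left=MsgNode("(1,1)", (5, 5, 20, 20),
                                         right=MsgNode("(1,2)", (30, 5, 20, 20))),
                            right=MsgNode("2", (60, 0, 40, 40),
                                          right=MsgNode("3", (0, 60, 100, 40),
                                                        left=MsgNode("(3,1)", (10, 65, 30, 20))))))

print(hit_test(tree, (10, 10)).level_id)   # falls inside child window (1,1)
print(hit_test(tree, (70, 10)).level_id)   # falls inside window 2 only
```

A click at (10, 10) lands in both window 1 and its child (1,1), and the in-order lookup correctly resolves it to the child window.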
Based on the description of the image acquisition method above, an embodiment of this application further discloses an image acquisition device, which may be applied in a server, for example the application server 101 in FIG. 1 or the application server 201 in FIG. 2, the application server being used to execute a set of application processes providing service support for realizing the application functions of the target application. The image acquisition device can execute the methods shown in FIG. 4 and FIGS. 6a-6c. Referring to FIG. 11, the image acquisition device 1100 may run the following units:
an obtaining unit 1101, configured to obtain, from the set of application processes, the target application process corresponding to a user identifier;
an interception unit 1102, configured to, when it is detected that an image rendering function in the target application process obtained by the obtaining unit 1101 is called, invoke the data interception module to intercept the multiple pieces of window image data currently generated;
a first processing unit 1103, configured to perform image composition processing on the multiple pieces of window image data intercepted by the interception unit 1102 to obtain the user interface image to be displayed; and
a sending unit 1104, configured to send a notification message carrying the user interface image obtained by the first processing unit 1103 to the user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on its user interface.
In one embodiment, the obtaining unit 1101 is further configured to obtain the frame display parameters set for the target application process;
the device 1100 further includes:
a determining unit 1105, configured to determine the window level identifier of each piece of window image data according to the frame display parameters obtained by the obtaining unit 1101; and
the first processing unit 1103 is further configured to perform image composition processing on the multiple pieces of window image data according to the window level identifiers determined by the determining unit 1105.
In one embodiment, the device 1100 further includes:
a first generating unit 1106, configured to generate an image binary tree according to the window level identifiers of the pieces of window image data determined by the determining unit 1105; and
the first processing unit 1103 is further configured to traverse the image binary tree generated by the first generating unit 1106 in the preset first traversal order to obtain the composition order of the pieces of window image data, and to perform image composition processing on the multiple pieces of window image data in the composition order.
In one embodiment, the first generating unit 1106 is configured to construct an image multiway tree according to the window level identifiers of the pieces of window image data determined by the determining unit 1105, and to convert the image multiway tree into the image binary tree.
In one embodiment, the device 1100 further includes:
a first generating unit 1106, configured to generate an image binary tree according to the window level identifiers of the pieces of window image data determined by the determining unit 1105; and
the interception unit 1102 is configured to traverse the image binary tree generated by the first generating unit 1106 in the preset second traversal order to obtain the image capture order of the pieces of window image data, and to invoke the data interception module in the image capture order to intercept the multiple pieces of window image data.
In one embodiment, the interception unit 1102 is configured to determine the rendering method included in the image rendering function; to determine, according to the rendering method, the data interception submodule in the data interception module corresponding to the rendering method; and to invoke the data interception submodule to intercept the multiple pieces of window image data obtained when the image rendering function executes.
In one embodiment, the device 1100 further includes:
a second generating unit 1107, configured to generate a message response binary tree according to the window level identifiers of the pieces of window image data determined by the determining unit 1105; and
the first processing unit 1103 is configured to determine the display position information of each piece of window image data based on the message response binary tree generated by the second generating unit 1107, and to compose the multiple pieces of window image data according to the display position information and the composition order.
In one embodiment, the message response binary tree includes multiple message nodes, and the obtaining unit 1101 is further configured to receive a user operation message sent by the user terminal and to obtain operation position information from the user operation message;
the device 1100 further includes:
a second processing unit 1108, configured to traverse the message response binary tree generated by the second generating unit 1107 in the preset third traversal order, and to determine the message node for which the operation position information obtained by the obtaining unit 1101 satisfies the preset position condition as the target message node; and to take the window image data corresponding to the target message node as the target window image data and process the window corresponding to the target window image data in response to the user operation message.
According to another embodiment of this application, the units of the image acquisition device shown in FIG. 11 may be combined, separately or entirely, into one or several other units, or one (or more) of them may be further split into multiple functionally smaller units; this achieves the same operations without affecting the technical effects of the embodiments of this application. The above units are divided based on logical functions; in practical applications, the function of one unit may be realized by multiple units, or the functions of multiple units may be realized by one unit. In other embodiments of this application, the image acquisition device may also include other units; in practical applications, these functions may be implemented with the assistance of other units, and may be implemented by multiple units in cooperation.
According to another embodiment of this application, the image acquisition device shown in FIG. 11 may be constructed, and the image acquisition method of the embodiments of this application implemented, by running a computer program (including program code) capable of executing the steps of the corresponding methods shown in FIG. 4 or FIGS. 6a-6c on a general-purpose computing device, such as a computer, that includes processing elements and storage elements such as a central processing unit (CPU), a random access memory (RAM), and a read-only memory (ROM). The computer program may be recorded on, for example, a computer-readable recording medium, loaded into the aforementioned computing device via the computer-readable recording medium, and run therein.
Based on the descriptions of the method embodiments and device embodiments above, an embodiment of this application further provides a server, which may be the application server 101 shown in FIG. 1 or the application server 201 shown in FIG. 2, the server being used to execute a set of application processes providing service support for realizing the application functions of the target application. Referring to FIG. 12, the server includes at least a processor 1201, a network interface 1202, and a computer storage medium 1203. The network interface 1202 is used to receive or send data when connected to a network.
The computer storage medium 1203 may be stored in the server's memory; the computer storage medium 1203 is used to store a computer program, the computer program includes program instructions, and the processor 1201 is used to execute the program instructions stored in the computer storage medium 1203. The processor 1201 (Central Processing Unit, CPU) is the computing core and control core of the terminal; it is suitable for implementing one or more instructions, and specifically for loading and executing one or more instructions to implement the corresponding method flows or functions. In one embodiment, the processor 1201 described in the embodiments of this application may be used to execute the image acquisition methods described in FIG. 4 and FIGS. 6a-6c.
An embodiment of this application further provides a computer storage medium (memory), which is a memory device in a terminal for storing programs and data. It can be understood that the computer storage medium here may include a built-in storage medium of the terminal and, of course, may also include an extended storage medium supported by the terminal. The computer storage medium provides storage space, which stores the terminal's operating system. The storage space also stores one or more instructions suitable for being loaded and executed by the processor 1201; these instructions may be one or more computer programs (including program code). It should be noted that the computer storage medium here may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory; optionally, it may also be at least one computer storage medium located far away from the aforementioned processor.
In one embodiment, the processor 1201 may load and execute one or more instructions stored in the computer storage medium to implement the corresponding steps of the image acquisition method embodiments above; in a specific implementation, the one or more instructions in the computer storage medium are loaded by the processor 1201 to execute the image acquisition methods described in FIG. 4 and FIGS. 6a-6c.
What is disclosed above is merely preferred embodiments of this application and certainly cannot limit the scope of its claims; equivalent changes made according to the claims of this application therefore still fall within the scope covered by this application.

Claims (19)

  1. An image acquisition method, executed by a server, the method comprising:
    obtaining, from a set of application processes, a target application process corresponding to a user identifier;
    when it is detected that an image rendering function in the target application process is called, invoking a data interception module to intercept multiple pieces of currently generated window image data;
    performing image composition processing on the multiple pieces of window image data to obtain a user interface image to be displayed; and
    sending a notification message carrying the user interface image to a user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on a user interface.
  2. The method according to claim 1, further comprising:
    obtaining frame display parameters set for the target application process; and
    determining a window level identifier of each piece of window image data according to the frame display parameters;
    wherein the performing image composition processing on the multiple pieces of window image data to obtain a user interface image to be displayed comprises:
    performing image composition processing on the multiple pieces of window image data according to the window level identifiers.
  3. The method according to claim 2, further comprising:
    generating an image binary tree according to the window level identifiers of the pieces of window image data;
    wherein the performing image composition processing on the multiple pieces of window image data according to the window level identifiers comprises:
    traversing the image binary tree in a preset first traversal order to obtain a composition order of the pieces of window image data; and
    performing image composition processing on the multiple pieces of window image data in the composition order.
  4. The method according to claim 3, wherein the generating an image binary tree according to the window level identifiers of the pieces of window image data comprises:
    constructing an image multiway tree according to the window level identifiers of the pieces of window image data; and
    converting the image multiway tree into the image binary tree.
  5. The method according to claim 2, further comprising:
    generating an image binary tree according to the window level identifiers of the pieces of window image data;
    wherein the, when it is detected that an image rendering function in the target application process is called, invoking a data interception module to intercept multiple pieces of currently generated window image data comprises:
    traversing the image binary tree in a preset second traversal order to obtain an image capture order of the pieces of window image data; and
    invoking the data interception module in the image capture order to intercept the multiple pieces of window image data.
  6. The method according to claim 1, wherein the, when it is detected that an image rendering function in the target application process is called, invoking a data interception module to intercept multiple pieces of currently generated window image data comprises:
    determining a rendering method included in the image rendering function;
    determining, according to the rendering method, a data interception submodule in the data interception module corresponding to the rendering method; and
    invoking the data interception submodule to intercept the multiple pieces of window image data obtained when the image rendering function executes.
  7. The method according to claim 3, further comprising:
    generating a message response binary tree according to the window level identifiers of the pieces of window image data;
    wherein the performing image composition processing on the multiple pieces of window image data in the composition order comprises:
    determining display position information of each piece of window image data based on the message response binary tree; and
    composing the multiple pieces of window image data according to the display position information and the composition order.
  8. The method according to claim 7, wherein the message response binary tree includes multiple message nodes, and the method further comprises:
    receiving a user operation message sent by the user terminal, and obtaining operation position information from the user operation message;
    traversing the message response binary tree in a preset third traversal order, and determining a message node for which the operation position information satisfies a preset position condition as a target message node; and
    taking the window image data corresponding to the target message node as target window image data, and processing the window corresponding to the target window image data in response to the user operation message.
  9. The method according to claim 8, wherein each message node records window position information and window size information corresponding to a piece of window image data; and
    the operation position information satisfying the preset position condition means that the operation position information falls within the range defined by the window position information and window size information recorded in the target message node.
  10. An image acquisition device, configured in a server, the device comprising:
    an obtaining unit, configured to obtain, from a set of application processes, a target application process corresponding to a user identifier;
    an interception unit, configured to, when it is detected that an image rendering function in the target application process obtained by the obtaining unit is called, invoke a data interception module to intercept multiple pieces of currently generated window image data;
    a first processing unit, configured to perform image composition processing on the multiple pieces of window image data intercepted by the interception unit to obtain a user interface image to be displayed; and
    a sending unit, configured to send a notification message carrying the user interface image obtained by the first processing unit to a user terminal corresponding to the user identifier, so that the user terminal displays the user interface image on a user interface.
  11. The device according to claim 10, wherein the obtaining unit is further configured to obtain frame display parameters set for the target application process;
    the device further comprises:
    a determining unit, configured to determine a window level identifier of each piece of window image data according to the frame display parameters obtained by the obtaining unit; and
    the first processing unit is further configured to perform image composition processing on the multiple pieces of window image data according to the window level identifiers determined by the determining unit.
  12. The device according to claim 11, further comprising:
    a first generating unit, configured to generate an image binary tree according to the window level identifiers of the pieces of window image data determined by the determining unit;
    wherein the first processing unit is further configured to traverse the image binary tree generated by the first generating unit in a preset first traversal order to obtain a composition order of the pieces of window image data, and to perform image composition processing on the multiple pieces of window image data in the composition order.
  13. The device according to claim 12, wherein the first generating unit is configured to construct an image multiway tree according to the window level identifiers of the pieces of window image data determined by the determining unit, and to convert the image multiway tree into the image binary tree.
  14. The device according to claim 11, further comprising:
    a first generating unit, configured to generate an image binary tree according to the window level identifiers of the pieces of window image data determined by the determining unit;
    wherein the interception unit is configured to traverse the image binary tree generated by the first generating unit in a preset second traversal order to obtain an image capture order of the pieces of window image data, and to invoke the data interception module in the image capture order to intercept the multiple pieces of window image data.
  15. The device according to claim 10, wherein the interception unit is configured to determine a rendering method included in the image rendering function; to determine, according to the rendering method, a data interception submodule in the data interception module corresponding to the rendering method; and to invoke the data interception submodule to intercept the multiple pieces of window image data obtained when the image rendering function executes.
  16. The device according to claim 12, further comprising:
    a second generating unit, configured to generate a message response binary tree according to the window level identifiers of the pieces of window image data determined by the determining unit;
    wherein the first processing unit is configured to determine display position information of each piece of window image data based on the message response binary tree generated by the second generating unit, and to compose the multiple pieces of window image data according to the display position information and the composition order.
  17. The device according to claim 16, wherein the message response binary tree includes multiple message nodes, and the obtaining unit is further configured to receive a user operation message sent by the user terminal and to obtain operation position information from the user operation message;
    the device further comprises:
    a second processing unit, configured to traverse the message response binary tree generated by the second generating unit in a preset third traversal order, and to determine a message node for which the operation position information obtained by the obtaining unit satisfies a preset position condition as a target message node; and to take the window image data corresponding to the target message node as target window image data and process the window corresponding to the target window image data in response to the user operation message.
  18. A server, comprising:
    a processor, adapted to implement one or more instructions; and
    a computer storage medium storing one or more instructions, the one or more instructions being adapted to be loaded by the processor to execute the image acquisition method according to any one of claims 1-9.
  19. A computer storage medium storing computer program instructions which, when executed by a processor, perform the image acquisition method according to any one of claims 1-9.
PCT/CN2020/092076 2019-05-29 2020-05-25 Image acquisition method, device, server and storage medium WO2020238846A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
SG11202105855QA SG11202105855QA (en) 2019-05-29 2020-05-25 Image acquisition method, device, server and storage medium
KR1020217023112A KR102425168B1 (ko) 2019-05-29 2020-05-25 Image acquisition method, device, server, and storage medium
EP20813666.3A EP3979589A4 (en) 2019-05-29 2020-05-25 IMAGE CAPTURE METHOD, DEVICE, SERVER AND STORAGE MEDIUM
JP2021548678A JP7297080B2 (ja) 2019-05-29 2020-05-25 Image acquisition method, image acquisition device, server, and computer program
US17/443,481 US11606436B2 (en) 2019-05-29 2021-07-27 Image obtaining method and apparatus, server, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910468135.3 2019-05-29
CN201910468135.3A CN110213265B (zh) 2019-05-29 2019-05-29 Image acquisition method, device, server and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/443,481 Continuation US11606436B2 (en) 2019-05-29 2021-07-27 Image obtaining method and apparatus, server, and storage medium

Publications (1)

Publication Number Publication Date
WO2020238846A1 true WO2020238846A1 (zh) 2020-12-03

Family

ID=67789895

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092076 WO2020238846A1 (zh) 2019-05-29 2020-05-25 图像获取方法、装置、服务器及存储介质

Country Status (7)

Country Link
US (1) US11606436B2 (zh)
EP (1) EP3979589A4 (zh)
JP (1) JP7297080B2 (zh)
KR (1) KR102425168B1 (zh)
CN (1) CN110213265B (zh)
SG (1) SG11202105855QA (zh)
WO (1) WO2020238846A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110213265B (zh) 2019-05-29 2021-05-28 腾讯科技(深圳)有限公司 Image acquisition method, device, server and storage medium
CN112099884A (zh) * 2020-08-11 2020-12-18 西安万像电子科技有限公司 Image rendering method and device
CN112650899B (zh) * 2020-12-30 2023-10-03 中国平安人寿保险股份有限公司 Data visualization rendering method and apparatus, computer device, and storage medium
CN114820882A (zh) * 2022-04-19 2022-07-29 北京百度网讯科技有限公司 Image acquisition method, apparatus, device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160379400A1 (en) * 2015-06-23 2016-12-29 Intel Corporation Three-Dimensional Renderer
CN106843897A (zh) * 2017-02-09 2017-06-13 腾讯科技(深圳)有限公司 Method and device for capturing game frames
CN107071331A (zh) * 2017-03-08 2017-08-18 苏睿 Image display method, device and system, storage medium, and processor
CN108939556A (zh) * 2018-07-27 2018-12-07 珠海金山网络游戏科技有限公司 Screenshot method and device based on a game platform
CN110213265A (zh) * 2019-05-29 2019-09-06 腾讯科技(深圳)有限公司 Image acquisition method, device, server and storage medium
CN109582425B (zh) * 2018-12-04 2020-04-14 中山大学 GPU service redirection system and method based on cloud-terminal GPU fusion

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6687335B1 (en) * 1997-03-31 2004-02-03 Southwestern Bell Telephone Company User interface and system to facilitate telephone circuit maintenance and testing
JP2003190640A (ja) * 2001-12-27 2003-07-08 Koei:Kk Drawing command control method, recording medium, drawing command control device, and program
US7486294B2 (en) * 2003-03-27 2009-02-03 Microsoft Corporation Vector graphics element-based model, application programming interface, and markup language
US7260784B2 (en) * 2003-05-07 2007-08-21 International Business Machines Corporation Display data mapping method, system, and program product
US8079037B2 (en) * 2005-10-11 2011-12-13 Knoa Software, Inc. Generic, multi-instance method and GUI detection system for tracking and monitoring computer applications
US20070300179A1 (en) * 2006-06-27 2007-12-27 Observe It Ltd. User-application interaction recording
US20110289117A1 (en) * 2010-05-19 2011-11-24 International Business Machines Corporation Systems and methods for user controllable, automated recording and searching of computer activity
US9361464B2 (en) * 2012-04-24 2016-06-07 Jianqing Wu Versatile log system
CN103455234A (zh) 2012-06-01 2013-12-18 腾讯科技(深圳)有限公司 Method and device for displaying an application interface
US9069608B2 (en) * 2013-03-06 2015-06-30 Vmware, Inc. Method and system for providing a roaming remote desktop
US10387546B1 (en) * 2013-06-07 2019-08-20 United Services Automobile Association Web browsing
EP3092622A4 (en) * 2014-01-09 2017-08-30 Square Enix Holdings Co., Ltd. Methods and systems for efficient rendering of game screens for multi-player video game
KR102455232B1 (ko) * 2015-06-23 2022-10-17 삼성전자 주식회사 Method and electronic device for context-based tab management
US10179290B2 (en) * 2016-07-21 2019-01-15 Sony Interactive Entertainment America Llc Method and system for accessing previously stored game play via video recording as executed on a game cloud system
CN106371824A (zh) * 2016-08-23 2017-02-01 广州优视网络科技有限公司 Portable device and method and device for controlling display of application pop-up messages
US10180815B1 (en) * 2017-09-06 2019-01-15 Xmpie (Israel) Ltd. Systems and methods for variable data printing

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160379400A1 (en) * 2015-06-23 2016-12-29 Intel Corporation Three-Dimensional Renderer
CN106843897A (zh) * 2017-02-09 2017-06-13 腾讯科技(深圳)有限公司 Method and device for capturing game frames
CN107071331A (zh) * 2017-03-08 2017-08-18 苏睿 Image display method, device and system, storage medium, and processor
CN108939556A (zh) * 2018-07-27 2018-12-07 珠海金山网络游戏科技有限公司 Screenshot method and device based on a game platform
CN109582425B (zh) * 2018-12-04 2020-04-14 中山大学 GPU service redirection system and method based on cloud-terminal GPU fusion
CN110213265A (zh) * 2019-05-29 2019-09-06 腾讯科技(深圳)有限公司 Image acquisition method, device, server and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3979589A4 *

Also Published As

Publication number Publication date
CN110213265B (zh) 2021-05-28
KR20210104143A (ko) 2021-08-24
US11606436B2 (en) 2023-03-14
CN110213265A (zh) 2019-09-06
JP7297080B2 (ja) 2023-06-23
US20210360086A1 (en) 2021-11-18
EP3979589A1 (en) 2022-04-06
SG11202105855QA (en) 2021-07-29
JP2022534346A (ja) 2022-07-29
EP3979589A4 (en) 2022-08-10
KR102425168B1 (ko) 2022-07-25

Similar Documents

Publication Publication Date Title
WO2020238846A1 (zh) Image acquisition method, device, server, and storage medium
US9549007B2 (en) User interface widget unit sharing for application user interface distribution
WO2019169913A1 (zh) Data processing method, device, server, and system
US8924985B2 (en) Network based real-time virtual reality input/output system and method for heterogeneous environment
CN112791399B Display method and apparatus for cloud game frames, system, medium, and electronic device
EP3311565B1 (en) Low latency application streaming using temporal frame transformation
US8819139B2 (en) Virtual desktop infrastructure (VDI) login acceleration
WO2024066828A1 (zh) Data processing method, apparatus, device, computer-readable storage medium, and computer program product
CN108073350A Object storage system and method for cloud rendering
CN108074210A Object acquisition system and method for cloud rendering
CN114237840A Resource interaction method, apparatus, terminal, and storage medium
CN115794139A Mirror image data processing method, apparatus, device, and medium
CN114222003A Service invocation method, system, apparatus, device, and storage medium
CN112328356B Interworking method and apparatus between Android and Windows, storage medium, and computer device
CN104038511B Resource management method and device
US11425219B1 (en) Smart stream capture
CN111111175A Game frame generation method, device, and mobile terminal
WO2023035619A1 (zh) Scene rendering method, apparatus, device, and system
CN116310232A Data processing method for digital collectibles, device, storage medium, and program product
US10715846B1 (en) State-based image data stream provisioning
CN114785848A Collaborative interaction and collaboration method, apparatus, and system between electronic devices
EP4038545A1 (en) Freeview video coding
KR20170055881A (ko) Online game providing system and method
US20240013461A1 (en) Interactive Animation Generation
KR102503119B1 (ko) Method and apparatus for tree-based point cloud compressed media streams

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20813666

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20217023112

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2021548678

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020813666

Country of ref document: EP

Effective date: 20220103