WO2020093825A1 - Virtual scene recognition and interactive key matching method for an application program, and computing device - Google Patents

Virtual scene recognition and interactive key matching method for an application program, and computing device

Info

Publication number
WO2020093825A1
WO2020093825A1 (PCT/CN2019/109985)
Authority
WO
WIPO (PCT)
Prior art keywords
texture
preset
key
rendering
picture
Prior art date
Application number
PCT/CN2019/109985
Other languages
English (en)
French (fr)
Inventor
吴智文
黄源超
Original Assignee
腾讯科技(深圳)有限公司
Priority date
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Priority to EP19881725.6A (published as EP3879496A4)
Priority to JP2020563939A (published as JP7221989B2)
Publication of WO2020093825A1
Priority to US17/085,649 (published as US11511188B2)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/80 - Creating or modifying a manually drawn or painted image using a manual input device, e.g. mouse, light pen, direction keys on keyboard
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/22 - Setup operations, e.g. calibration, key configuration or button assignment
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/53 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85 - Providing additional services to players
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/005 - General purpose rendering architectures

Definitions

  • the present application relates to the field of Internet technology, and in particular, to a virtual scene recognition and interactive key matching method and computing device for application programs.
  • Embodiments of the present application provide a method, apparatus, computer storage medium, and computing device for virtual scene recognition and interactive key matching of application programs.
  • the virtual scene recognition and interactive key matching method of the application program is applied to a computing device capable of building or running a virtual operating system, and the virtual scene recognition and interactive key matching method of the application program includes:
  • the rendering data includes a preset texture picture
  • a key configuration file is called to perform matching processing between the preset texture picture and a corresponding physical key of the computing device
  • the key configuration information being associated with the matched physical keys.
  • the virtual scene recognition and interactive key matching device of the application program is applied to a computing device capable of building or running a virtual operating system.
  • the virtual scene recognition and interactive key matching device of the application program includes:
  • a startup module used to start an application program in the virtual operating system
  • a rendering detection module configured to perform rendering processing on the virtual scene to be presented by the application program to obtain rendering data used for rendering the virtual scene
  • a texture picture recognition module used to recognize whether the rendering data obtained by the rendering detection module includes a preset texture picture
  • a key setting module, used to call the corresponding key configuration file to perform matching processing between the preset texture picture and the corresponding physical key of the computing device when the texture picture recognition module recognizes that the virtual scene includes a preset texture picture;
  • the display module is configured to present the virtual scene and key configuration information corresponding to the virtual scene on the display screen of the computing device, and the key configuration information is associated with the matched physical keys.
  • the computer storage medium stores multiple instructions, and the instructions are suitable for being loaded by the processor and performing the following steps:
  • the rendering data includes a preset texture picture
  • a key configuration file is called to perform matching processing between the preset texture picture and a corresponding physical key of the computing device
  • the key configuration information being associated with the matched physical keys.
  • the computing device can build or run a virtual operating system
  • the computing device includes a processor and a memory, computer readable instructions are stored on the memory, and when executed by the processor the computer readable instructions perform the following operations:
  • the rendering data includes a preset texture picture
  • a key configuration file is called to perform matching processing between the preset texture picture and a corresponding physical key of the computing device
  • the key configuration information being associated with the matched physical keys.
  • rendering processing is performed on a virtual scene to be presented by the application program to obtain rendering data used to render the virtual scene, and the rendering data is examined for a preset texture picture; when it is recognized that there is a preset texture picture in the virtual scene, the corresponding key configuration file is called to complete the matching process between the texture picture and the corresponding physical key of the computing device, and the virtual scene and the key configuration information corresponding to the virtual scene are presented on the screen of the computing device.
  • the embodiments of the present application can quickly and efficiently identify the game scene based on the recognition of texture pictures in the mobile game application, and dynamically and automatically arrange the game operation buttons in the recognized game scene in a reasonable way. This makes the game operation button settings intelligent: game players can simply use the mouse and keyboard to experience the game and no longer need to set a large number of game operation buttons themselves, giving them a better game operation experience. Such key arrangement based on game scene recognition is therefore more intelligent and more effective.
  • FIG. 1 is a schematic diagram of an application environment of a method and device for virtual scene recognition and interactive key matching of an application program provided according to an embodiment of the present application;
  • FIG. 2 is a schematic flowchart of a method for virtual scene recognition and interactive key matching of an application program according to an embodiment of the present application
  • FIG. 3A is a schematic diagram of the relationship between textures and texture pictures in an exemplary mobile game according to an embodiment of the present application
  • FIG. 3B is a schematic diagram of the position information of a texture picture within its texture in the example mobile game provided according to an embodiment of the present application;
  • FIG. 4A is a schematic diagram of an example game scene provided according to an embodiment of the present application.
  • FIG. 4B is a schematic diagram of another example game scene provided according to an embodiment of the present application.
  • FIG. 5 is a schematic flow chart of rendering a virtual scene to be presented by an application program according to an embodiment of the present application to obtain rendering data for rendering the virtual scene;
  • FIG. 6 is a schematic flowchart of identifying a preset texture image in rendering data according to an embodiment of the present application
  • FIG. 7 is a schematic flowchart illustrating steps 601 and 602 in FIG. 6 according to an embodiment of the present application.
  • FIG. 8 is a further schematic flowchart of a virtual scene recognition and interactive key matching method for an application program provided according to an embodiment of the present application;
  • FIG. 9 is a schematic diagram of calling the corresponding key configuration file, when it is recognized that the virtual scene includes a preset texture picture, to complete the matching process between the texture picture and the corresponding physical key of the computing device, provided according to an embodiment of the present application;
  • FIG. 10 is a further schematic flowchart of a method for virtual scene recognition and interactive key matching of an application program according to an embodiment of the present application
  • FIG. 11 is a schematic diagram of an apparatus for virtual scene recognition and interactive key matching of an application program provided according to an embodiment of the present application
  • FIG. 12 is a schematic diagram of a texture picture recognition module in a virtual scene recognition and interactive key matching device for an application program provided according to an embodiment of the present application;
  • FIG. 13 is a schematic diagram of a texture judgment module and a texture coordinate judgment module in a texture picture recognition module provided according to an embodiment of the present application;
  • FIG. 14 is a schematic diagram for further description of an apparatus for virtual scene recognition and interactive key matching of an application program according to an embodiment of the present application;
  • FIG. 15 is a schematic diagram of a key setting module of an apparatus for virtual scene recognition and interactive key matching according to an embodiment of the present application;
  • FIG. 16 is a schematic diagram of a hardware structure of a computing device in which an apparatus for virtual scene recognition and interactive key matching of an application program according to an embodiment of the present application is provided.
  • Embodiments of the present application provide a method and apparatus for virtual scene recognition and interactive key matching of an application program, which are applied to a computing device capable of building or running a virtual operating system, and the virtual operating system is used to run the foregoing application program.
  • the foregoing application program may be a mobile game application program installed on a computing device, which is not specifically limited in this embodiment of the present application.
  • While the mobile game application runs on the computing device, the game scene can be quickly and efficiently identified based on the recognition of texture pictures in the mobile game application, and the game operation buttons can be laid out in the recognized game scene automatically, dynamically, and in real time, so that the game operation button settings are intelligent. Game players do not need to set the game operation buttons themselves in the game scene; they only need to operate the corresponding physical keys on the keyboard and mouse, automatically set by the computing device, to play the mobile game.
  • This key setting method based on game scene recognition gives game players a better game operation experience and is more intelligent and more effective.
  • FIG. 1 is a schematic diagram of an application environment of a method and a device for virtual scene recognition and interactive key matching of an application program provided according to an embodiment of the present application.
  • the application environment includes a computing device 10 and a server 20.
  • the computing device 10 and the server 20 may be personal computers.
  • the computing device 10 is configured with a keyboard 11 and a mouse 12.
  • the mobile game application and the virtual scene recognition and interactive key matching device 30 of the application program are installed on the computing device 10.
  • the virtual scene recognition and interactive key matching device 30 of the application program can be embodied in the form of a PC emulator, for example, a PC Android emulator that simulates Android mobile games. In the following, the PC emulator is also simply referred to as an emulator.
  • the virtual scene recognition and interactive key matching device 30 of the application program can execute the virtual scene recognition and interactive key matching method of the application program, which is used to: start the application program in the virtual operating system; perform rendering processing on the virtual scene to be presented to obtain the rendering data used to render the virtual scene; identify the preset texture picture in the rendering data; when it is recognized that the virtual scene has a preset texture picture, call the corresponding key configuration file to complete the matching process between the texture picture and the corresponding physical key of the computing device; and present the virtual scene and the key configuration information corresponding to the virtual scene on the display screen of the computing device.
  • the server 20 can be used to coordinate with the update of the mobile phone game, and deliver updated pre-configuration data to the computing device 10, for example, the pre-configuration data includes operation coordinate data of keys to be set in the game, and so on.
  • the keys in the embodiment of the present application include the keys on the keyboard 11 and the left and right keys of the mouse 12 and so on.
  • “mobile phone game” is also simply referred to as "game”.
  • FIG. 2 is a schematic flowchart of a virtual scene recognition and interactive key matching method for an application program according to an embodiment of the present application.
  • the virtual scene recognition and interactive key matching method of the application program can be applied to a computing device, such as a personal computer.
  • the virtual scene recognition and interactive key matching method of the application includes the following steps:
  • Step 201 The computing device starts an application program in the virtual operating system.
  • a first operating system is running on a computing device
  • the application program is an application program suitable for a second operating system. Therefore, when it is necessary to run the application for the second operating system on the computing device
  • a virtual operating system identical to the second operating system needs to be built and run on the computing device running the first operating system to run the application program suitable for the second operating system.
  • One feasible way is to install an emulator on the computing device and use a virtual machine to construct a virtual operating system that is the same as the second operating system, so as to provide an operating environment for the application program.
  • the first operating system is, for example, an operating system such as Windows or Linux
  • the second operating system is, for example, an Android operating system
  • the virtual operating system is, for example, a virtual Android operating system.
  • Virtual machine (Virtual Machine)
  • the virtual operating system is a brand new virtual image of the second operating system; it has exactly the same functions as the real second operating system.
  • operations are performed in this independent virtual operating system; for example, the virtual operating system can independently install and run software, save data, and have its own independent desktop, without affecting the first operating system on the computing device.
  • the virtual machine in the embodiment of the present application may be, for example, a VirtualBox virtual machine.
  • a Windows operating system runs on a computing device
  • a mobile game application is suitable for an Android operating system.
  • an emulator can be run on the computing device, and the emulator constructs, using a virtual machine, a virtual Android operating system that provides a running environment for mobile game applications. After running the emulator, the user can download the mobile game application in the interface of the emulator and start the mobile game application.
  • Step 202 The computing device performs rendering processing on the virtual scene to be presented by the application program to obtain rendering data used to render the virtual scene.
  • various virtual scenes may be presented on the display screen according to the operation logic of the application program itself.
  • the application needs to perform corresponding rendering processing operations on these virtual scenes to be presented in the background.
  • the embodiment of the present application obtains the information of the virtual scene that needs to be presented on the display screen by detecting the rendering processing operations of the application in the background, and then determines the current running status of the application based on the obtained information about the virtual scene to be presented, so as to perform corresponding processing.
  • the corresponding processing is, for example, additional processing in addition to the running logic of the application itself, such as setting up plug-ins, developing upper-layer applications based on third-party application logic, and so on.
  • the corresponding processing may be to enable the physical key to take effect, and replace the touch operation of the touch screen originally supported by the application program with the physical key operation in the virtual scene.
  • the application program can render the virtual scene through the rendering function.
  • When an application is rendering a virtual scene, it may perform multiple rendering operations and call the rendering function multiple times.
  • the simulator detects each call to the rendering function and obtains the rendering data associated with that call; in this way the simulator obtains the rendering data used to render the virtual scene.
  • a game scene needs to be presented on a display screen according to its game logic, and the game scene is, for example, a game frame.
  • the emulator can detect the application program's call to the rendering function during each rendering operation while the game frame is being rendered, and thereby obtain the various rendering data used by the rendering function. After the rendering of a game frame ends, there is a refresh operation to start rendering the next game frame.
  • the simulator can determine each call to the rendering function during the rendering of a game frame by detecting the refresh operation.
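  • A minimal sketch of such frame delimiting, assuming the refresh operation corresponds to a buffer-swap call such as eglSwapBuffers and that the emulator can route GL calls through wrapper functions (both are assumptions here, with hypothetical function names), could look like the following:

```cpp
// Rough sketch (hypothetical names): count rendering-function calls per frame
// and treat a buffer-swap call as the refresh operation that separates frames.
// In a real hook, saved pointers to the original functions would be called.
#include <EGL/egl.h>
#include <GLES2/gl2.h>
#include <cstdio>

static int g_drawCallsInFrame = 0;

// Wrapper the emulator routes glDrawElements calls through (assumed mechanism).
void hookedGlDrawElements(GLenum mode, GLsizei count, GLenum type, const void* indices) {
    ++g_drawCallsInFrame;                        // one rendering operation of the current frame
    glDrawElements(mode, count, type, indices);  // forward to the real implementation
}

// Wrapper for the refresh operation that ends a frame (assumed to be eglSwapBuffers).
EGLBoolean hookedEglSwapBuffers(EGLDisplay dpy, EGLSurface surface) {
    std::printf("frame finished after %d draw calls\n", g_drawCallsInFrame);
    g_drawCallsInFrame = 0;                      // the next frame starts with a fresh count
    return eglSwapBuffers(dpy, surface);
}
```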
  • step 203 the computing device identifies whether the preset texture image is included in the rendering data.
  • the simulator recognizes the preset texture picture in the rendering data.
  • a preset texture picture is a texture picture designated in advance, from which the current running state of the application can be determined and the current virtual scene can be identified, for example, a texture picture that requires additional processing beyond the application's own running logic.
  • When the simulator detects that the rendering data used to render the virtual scene includes a preset texture picture, it may consider that texture picture to have taken effect.
  • the application is a mobile game application
  • some of the texture pictures are texture pictures of virtual operation buttons that game players can operate: when the mobile game application runs on a mobile phone, the game player can touch the virtual operation button displayed on the touch screen with a finger to perform the corresponding game operation. Other texture pictures are not virtual operation button texture pictures and cannot be operated by the game player, such as hint pictures.
  • the simulator designer can pre-set corresponding physical keys for the texture pictures of virtual operation buttons in the mobile game, so as to reproduce with physical keys the touch functions available on a touch screen. For the other texture pictures, no physical key needs to be set.
  • the simulator designer can collect the texture pictures for which corresponding physical keys need to be set into a configuration file, called a texture picture configuration file.
  • the simulator recognizes whether the rendering data contains a preset texture picture listed in the texture picture configuration file.
  • the texture picture configuration file may further include the cyclic redundancy check code (CRC, Cyclic Redundancy Check) of the texture in which the texture picture to be detected is located, the position coordinates (Coordinate) of the texture picture within that texture, the texture picture identifier (TextureId), height (height), width (width), remarks (comm), and other information.
  • CRC: Cyclic Redundancy Check
  • the identifier of the texture picture may be represented by a number, for example.
  • the texture picture is usually rectangular, and the position coordinates of the texture picture in the texture where it is located can be represented by the position coordinates of the four corners of the texture picture in the texture.
  • the texture picture configuration file includes, for example, the following content:
  • the texture picture configuration file of this example requires the simulator to detect, while the application is running, the texture picture whose position within the texture with cyclic redundancy check code 1576852035 is {1061945344, 1038876672, 1061945344, 1045823488, 1063518208, 1038876672, 1063518208, 1045823488} in the rendered virtual scene.
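  • As a sketch only, such a configuration entry could be held in memory roughly as follows; the struct layout, field types, and parsing format are assumptions rather than the file format defined by the patent, and the TextureId, width, height, and comm values are placeholders not given in the text above:

```cpp
// Sketch of an in-memory record for one texture picture configuration entry.
// Field set follows the description (texture CRC, Coordinate, TextureId,
// height, width, comm); layout and types are assumptions.
#include <array>
#include <cstdint>
#include <string>

struct TexturePictureConfig {
    std::uint32_t textureCrc;                 // CRC of the texture containing the picture
    std::array<std::uint32_t, 8> coordinate;  // four corner coordinates, stored as raw values
    int           textureId;                  // identifier of the texture picture
    int           width;                      // width of the texture picture
    int           height;                     // height of the texture picture
    std::string   comm;                       // remark
};

// The example entry quoted above: texture CRC 1576852035 with the listed corners.
static const TexturePictureConfig kExampleEntry = {
    1576852035u,
    {1061945344u, 1038876672u, 1061945344u, 1045823488u,
     1063518208u, 1038876672u, 1063518208u, 1045823488u},
    0, 0, 0, "placeholder"
};
```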
  • each texture has its corresponding identifier, and the position of each texture picture within its texture is fixed.
  • the texture can be in the format of a picture.
  • the texture identification may be, for example, the name of the texture, and the name of the texture may be a string of numbers or a character string.
  • FIG. 3A is a schematic diagram of the relationship between textures and texture pictures in an exemplary mobile game according to an embodiment of the present application.
  • the texture 30 is a texture from the PUBG (PlayerUnknown's Battlegrounds) game.
  • Within the texture 30 there are various small texture pictures, for example, texture pictures 31, 32, 33, and so on.
  • FIG. 3B is a schematic diagram of the position information of a texture picture within its texture in the example mobile phone game according to the embodiment of the present application.
  • there are five texture pictures in the texture 34 labeled 35, 36, 37, 38, and 39, respectively.
  • the position coordinates of the texture picture 35 in the texture 34 can be expressed as ⁇ (0.000000, 0.462891), (0.000000, 0.255859), (0.207031, 0.255859), (0.207031, 0.462891) ⁇ .
  • the coordinate (0.000000, 0.462891) represents the lower left corner of the texture picture 35
  • (0.000000, 0.255859) represents its upper left corner
  • the coordinates (0.207031, 0.255859) represents its upper right corner
  • the coordinates (0.207031, 0.462891) represents its lower right corner.
  • OpenGL ES: OpenGL for Embedded Systems
  • OpenGL ES is a subset of OpenGL designed mainly for embedded devices such as mobile phones, personal digital assistants (PDAs) and game consoles.
  • the value of Coordinate can be configured to correspond to the coordinate values of the four corners of the texture picture 35.
  • When rendering the game frame, the rendering function uses the texture identifier and the position coordinates of the texture picture within the texture as the data of the texture picture.
  • When the emulator detects that the application renders a game frame on the display screen according to the game logic, it compares the identifier of the texture in the rendering data used by the rendering function, and the position coordinates of the texture picture within that texture, against the preset texture and the position coordinates of the preset texture picture in the texture picture configuration file, to determine whether the rendering data, and hence the game frame, includes a preset texture picture for which a corresponding physical key needs to be set.
  • Step 204 When it is recognized that the rendering data includes a preset texture picture, the computing device calls a corresponding key configuration file to perform matching processing between the texture picture and the corresponding physical key of the computing device.
  • the simulator designer can set the texture image and the related information of the corresponding physical key as a configuration file, for example, a key configuration file.
  • the key configuration file required by the application can be pre-stored in the computing device.
  • When running the application on the computing device, the simulator, upon recognizing that the virtual scene includes the preset texture picture, searches the key configuration file according to the preset texture picture to obtain the key information corresponding to the preset texture picture, sets the key icon of the corresponding physical key in the virtual scene, and enables the physical key to take effect and become operable.
  • the key configuration file can be downloaded from the server to the computing device, and updates can likewise be downloaded from the server.
  • the key configuration file can include information about the texture pictures for which corresponding physical keys need to be set, such as the texture picture ID (TextureId), the texture picture description or name (for example, describing the function corresponding to the texture picture), and the texture picture remark (Comm).
  • TextureId: the identifier of the texture picture
  • Name: the description or name of the texture picture
  • Comm: the remark for the texture picture
  • the key configuration file also includes related information about the physical keys corresponding to the texture pictures, such as the name of the physical key (Itemname), the screen position coordinates corresponding to the physical key (Point_x, Point_y), and the functional description of the physical key (Description).
  • the function of the physical button may be, for example, the function of a virtual operation button in the game associated with the corresponding texture picture.
  • the key configuration file can also be configured to enable the physical key to take effect when detecting the rendered texture image, and display the corresponding key icon on the display screen, which is in a user-operable state.
  • the identification in the key configuration file and the identification in the texture image configuration file are the same or corresponding, so that the relevant information of the same texture image in the key configuration file and the texture image configuration file can be related.
  • When determining that the game scene includes a preset texture picture from the texture picture configuration file, the emulator may look up the key information corresponding to that preset texture picture in the key configuration file and enable the corresponding physical key to take effect.
  • For example, in the key configuration file the pick-up texture pictures can be pre-set to correspond to the F key, G key, H key, etc. on the keyboard; the view-surroundings texture picture corresponds to the Alt key; the aiming texture picture corresponds to the right mouse button;
  • the jump texture picture corresponds to the space bar;
  • the crouch texture picture corresponds to the C key;
  • the prone texture picture corresponds to the Z key;
  • the reload texture picture corresponds to the R key;
  • the direction control texture picture corresponds to the W key (forward), S key (backward), A key (leftward), and D key (rightward); the lean-left texture picture corresponds to the Q key;
  • the lean-right texture picture corresponds to the E key;
  • the microphone on/off texture picture corresponds to the Y key;
  • the sound on/off texture picture corresponds to the T key;
  • the weapon-switching texture pictures correspond to the number keys 1, 2, 3, etc.;
  • the backpack texture picture corresponds to the Tab key;
  • the press-front (while driving) texture picture corresponds to the Q key;
  • the lift-front texture picture corresponds to the E key;
  • the get-off-vehicle texture picture corresponds to the F key.
  • When the computing device configures the key configuration file, it can be configured so that one key serves multiple purposes: in different game scenes, the same physical key can have different operating functions.
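  • A minimal sketch of such scene-dependent key resolution is given below; the ids, names, and container are illustrative assumptions (texture picture id 30 is reused from the GetIntoCar example later in the description, id 42 is hypothetical):

```cpp
// Sketch of "one key, multiple purposes": the same physical key resolves to a
// different action depending on which preset texture picture (i.e. which game
// scene) is currently in effect. Ids and names are illustrative assumptions.
#include <map>
#include <string>
#include <utility>

enum class PhysicalKey { F, Q, E, Space };

// (active texture picture id, physical key) -> description of the resulting action
static const std::map<std::pair<int, PhysicalKey>, std::string> kKeyActions = {
    {{30, PhysicalKey::F}, "get into the car"},  // driving-related scene
    {{42, PhysicalKey::F}, "pick up an item"},   // on-foot scene (hypothetical id)
};

std::string resolveAction(int activeTexturePictureId, PhysicalKey key) {
    auto it = kKeyActions.find({activeTexturePictureId, key});
    return it != kKeyActions.end() ? it->second : "no action in this scene";
}
```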
  • Step 205 The computing device presents the virtual scene and key configuration information corresponding to the virtual scene on the display screen.
  • the key configuration information may be key related information, such as a key icon of a physical key.
  • the display screen belongs to the computing device and is a hardware device under the first operating system
  • the virtual scene and the key configuration information may be displayed on the display screen under the first operating system.
  • the simulator presents the virtual scene on the display screen under the first operating system, and displays the set button icon of the physical button at the screen position coordinates corresponding to the physical button.
  • the coordinates of the virtual scene under the virtual operating system may be converted into coordinates under the first operating system, and then the virtual scene is presented on the display screen.
  • For example, the coordinates of the virtual scene under the Android operating system are converted into coordinates under the Windows operating system.
  • the name of the physical button may be set on a background picture to form the button icon.
  • the background picture may be a circle, an ellipse, etc., and the center position of the background picture is set within the display area of the texture picture corresponding to the physical button, so that the two have overlapping areas.
  • the key icon of the physical key may also be set at a preset position. When setting the key icon, the emulator can try to avoid blocking the main image of the texture picture.
  • the background image can be set to have a certain transparency.
  • FIG. 4A is a schematic diagram of an example game scene provided according to an embodiment of the present application, in which is a game frame of a PUBG game.
  • the simulator sets the backpack texture picture 400 to the Tab key, and the pick-up texture pictures 402, 403, and 404 to the F key, G key, and H key.
  • the surrounding environment texture picture 405 corresponds to the Alt key
  • the aiming texture picture 406 corresponds to the right mouse button
  • the jump texture picture 407 corresponds to the space key
  • the squat texture picture 408 corresponds to the C key
  • the prone texture picture 409 corresponds to the Z key
  • the reload texture picture 410 corresponds to the R key
  • the left-hand shot texture picture 411 corresponds to the left mouse button.
  • the direction control texture picture 412 corresponds to the W key, S key, A key, and D key
  • the left leaning texture picture 413 corresponds to the Q key
  • the right leaning texture picture 414 corresponds to the E key
  • the on / off microphone texture picture 415 corresponds to the Y key
  • the sound texture picture 416 corresponds to the T key
  • the weapon switch texture pictures 417, 418, and 419 correspond to the numeric keys 1, 2, and 3.
  • FIG. 4B is a schematic diagram of another example game scene provided according to an embodiment of the present application.
  • the simulator sets keys for the get-off texture picture 420, the press-front texture picture 421, the lift-front texture picture 422, and the horn texture picture 423.
  • the corresponding keys are, respectively, the F key, Q key, E key, and left mouse button.
  • the keys corresponding to the on / off microphone texture image 425, the on / off sound texture image 426, and the game setting texture image 427 are respectively: the Y key, the T key, and the Esc key.
  • the view-surroundings texture picture 424 corresponds to the Alt key
  • the direction control texture picture 428 corresponds to the W key, S key, A key, and D key
  • the backpack texture picture 429 corresponds to the Tab key.
  • Keys that are not explicitly indicated as mouse buttons are keys on the keyboard.
  • rendering processing is performed on a virtual scene to be presented by the application program to obtain rendering data used to render the virtual scene, and a preset texture picture is identified in the rendering data; when it is recognized that there is a preset texture picture in the virtual scene, the corresponding key configuration file is called to complete the matching processing between the texture picture and the corresponding physical key of the computing device.
  • the virtual scene and the button configuration information corresponding to the virtual scene are presented on the screen of the computing device.
  • the game operation buttons are automatically and reasonably arranged in the recognized scene, making the game operation button setup intelligent. Game players do not need to set the game operation buttons themselves in the game scene; they only need to operate the physical keys automatically set by the computing device to play the mobile game. This kind of key arrangement based on game scene recognition gives game players a better operating experience and is more intelligent and more effective.
  • FIG. 5 is a schematic diagram of a process of rendering a virtual scene to be presented by an application program according to an embodiment of the present application to obtain rendering data used to render the virtual scene.
  • FIG. 5 may be a further description of step 202 in FIG. 2 or a further description of the same or similar steps in other embodiments.
  • step 202 may include:
  • Step 501 The computing device obtains the rendering data in the vertex cache object associated with the rendering function by detecting the call of the rendering function that renders the virtual scene to be rendered.
  • OpenGL: Open Graphics Library
  • OpenGL defines a cross-programming-language, cross-platform application programming interface (API, Application Programming Interface) specification.
  • API: Application Programming Interface
  • the graphics rendering requirements of the application are ultimately handled by rendering-related programs such as graphics card drivers that implement the OpenGL protocol.
  • a series of computer graphics operations such as vertex transformation, primitive assembly, texture mapping, and rasterization may be required.
  • the glDrawElements function is a primitive rendering function that obtains data from an array to draw or render primitives.
  • the primitives are, for example, triangles, lines, and points.
  • the rendered primitive is a triangle as an example for description.
  • the function parameters of the glDrawElements function include, for example: the primitive type (mode), which describes the type of primitive to be drawn or rendered; the count value (count), which represents the total number of vertices connected into primitives of that type (depending on the mode, count is less than or equal to the number of vertices of a single primitive of the mode type multiplied by the number of primitives); the type of the index values (type), which is one of GL_UNSIGNED_BYTE, GL_UNSIGNED_SHORT, or GL_UNSIGNED_INT; and a pointer (indices) to the storage location of the indices.
  • When the glDrawElements function is called, it creates a series of primitives by indexing a series of elements, using the number of indices represented by count.
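  • For illustration, a typical call with these parameters, drawing two textured triangles (a quad such as an on-screen button) from an index array, could look like this; buffer and shader setup are omitted and the values are examples only:

```cpp
// Illustrative call with the parameters described above: draw two textured
// triangles (a quad such as an on-screen button) from an index array.
#include <GLES2/gl2.h>

void drawButtonQuad() {
    static const GLushort indices[] = {0, 1, 2, 2, 3, 0};  // six indices over four vertices
    glDrawElements(GL_TRIANGLES,        // mode: the primitive type to render
                   6,                   // count: total number of indexed vertices
                   GL_UNSIGNED_SHORT,   // type of each index value
                   indices);            // pointer to the index storage
}
```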
  • OpenGL includes various library functions, such as basic library functions, utility library functions, auxiliary library functions, utility toolkit functions, and so on.
  • API functions for operating vertex buffer objects (VBO, Vertex Buffer Object) are provided in OpenGL.
  • VBO is a memory buffer created by OpenGL in the storage space of the graphics card, which is used to store various attribute information of vertices.
  • the foregoing attribute information may include, for example, vertex color data, vertex coordinates, texture coordinates, vertex normal vectors, and so on.
  • Vertex is a basic concept in computer graphics. In computer graphics, two-dimensional or three-dimensional objects can be drawn by triangles (primitives), for example. Each triangle has three vertices, and each vertex has a 3D position.
  • the 3D positions of the vertices of a triangle can be defined with an array. Since OpenGL works in 3D space while a 2D triangle is being rendered here, the z coordinate of each vertex of the triangle can be set to 0.0. The vertex coordinates are taken as input to the vertex shader; memory is created on the graphics processing unit (GPU, Graphics Processing Unit) to store the vertex coordinates, OpenGL is configured as to how this memory is to be interpreted, and it is specified how the vertex coordinates are sent to the graphics card.
  • the vertex coordinates are three-dimensional coordinates, and the display screen is two-dimensional. By calculating the vertex coordinates of the vertex, the final display position of the vertex on the display screen can be obtained.
  • rendering functions such as the above glDrawElements function can read various attribute information of vertices from VBO and use them.
  • the simulator may obtain the rendering data in the vertex buffer object (VBO) associated with the rendering function by detecting the call of the glDrawElements function that renders the virtual scene to be presented.
  • In this way, the rendering data in the vertex buffer object associated with each rendering-function call can be obtained, and the current running status of the application can be obtained in real time for corresponding processing.
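  • A possible sketch of reading back the vertex data of the currently bound VBO is shown below; it uses desktop OpenGL's glGetBufferSubData (loaded through GLEW) for readability, whereas a real emulator intercepting GLES traffic would more likely keep a shadow copy of buffer uploads. The helper name and hook mechanism are assumptions:

```cpp
// Sketch of reading back the vertex data of the currently bound VBO.
// Assumes a desktop OpenGL context and that glewInit() has already been called.
#include <GL/glew.h>
#include <vector>

std::vector<unsigned char> readBoundVertexBuffer() {
    GLint vbo = 0, size = 0;
    glGetIntegerv(GL_ARRAY_BUFFER_BINDING, &vbo);              // id of the currently bound VBO
    if (vbo == 0) return {};                                    // client-side arrays: nothing to read back
    glGetBufferParameteriv(GL_ARRAY_BUFFER, GL_BUFFER_SIZE, &size);
    std::vector<unsigned char> data(static_cast<size_t>(size));
    glGetBufferSubData(GL_ARRAY_BUFFER, 0, size, data.data());  // copy vertex attributes back to CPU memory
    return data;  // contains positions, texture coordinates, etc., per the attribute layout
}
```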
  • FIG. 6 is a schematic flowchart of identifying a preset texture picture in rendering data according to an embodiment of the present application.
  • FIG. 6 may be a further description of step 203 in FIG. 2 or a further description of the same or similar steps in other embodiments.
  • the mapping operation performed when rendering graphics is also called texture mapping, which pastes the texture picture, as a texture, onto the surface of the object to be rendered to enhance realism.
  • the texture mapping operation is carried out in units of triangles: the position in the texture corresponding to each vertex of the triangle on the object surface is specified, and the mapping is then performed.
  • the object to be rendered is an object in the game.
  • the texture mapping operation may be performed by the glDrawElements function, for example.
  • The simulator's detection that the application program is rendering a game frame can be implemented by detecting the application program's call to the glDrawElements function that performs texture mapping.
  • the simulator analyzes the vertex data in the VBO used by this function to obtain the identifier of the texture corresponding to the vertices to be rendered.
  • the simulator can recognize the texture used to render the game frame based on the texture identification.
  • the simulator obtains texture coordinates of the texture image in the texture corresponding to the vertex from the VBO. Through the texture coordinates of the texture image, you can locate the corresponding texture image in the texture.
  • Step 203 may include:
  • step 601 the computing device determines whether the preset texture in which the preset texture picture is located is included in the rendering data.
  • the simulator may determine whether the preset texture in which the preset texture picture is located is included in the rendering data by comparing the identifier of the texture included in the rendering data in the VBO with the recorded identifier of the preset texture. When the identifier of the texture in the rendering data matches the recorded identifier of the preset texture, the simulator determines that the rendering data includes the preset texture in which the preset texture picture is located, that is, the currently rendered virtual scene contains texture pictures that need further processing, for example, setting of corresponding physical keys.
  • FIG. 7 is a detailed description of steps 601 and 602 in FIG. 6. As shown in FIG. 7, step 601 may include:
  • Step 701 The computing device compares the identification of the texture in the rendering data with the recorded identification of the preset texture
  • Step 702 When the identifier of the texture in the rendering data matches the recorded identifier of the preset texture, the computing device determines that the rendering data includes the preset texture where the preset texture image is located.
  • When the computing device runs the application through the simulator, it first loads all textures that may be used by the application from the hard disk into memory. Therefore, when the emulator starts the application, it detects the virtual operating system loading the application data from the hard disk into the graphics card memory.
  • the application data includes textures used by the application.
  • the glTexImage2D function is called when the texture is loaded into the memory.
  • the simulator can obtain the cyclic redundancy check code for each texture.
  • the simulator computes the cyclic redundancy check code of the acquired texture and compares it with the cyclic redundancy check code of the preset texture in the texture picture configuration file, to determine whether the cyclic redundancy check code of the texture in the application data matches that of the preset texture.
  • When they match, the identifier (ID) of the texture in the application data is recorded as the identifier of the preset texture.
  • the identifier of the texture may be a numeric character string, which is not specifically limited in this embodiment of the present application.
  • While the identifier assigned to a texture may vary, the cyclic redundancy check code of the texture is unchanged. Since the application refers to a texture by its identifier when using it, the cyclic redundancy check code of the texture to be detected is configured in the texture picture configuration file, and the identifier of that texture in the application data is determined according to the cyclic redundancy check code, so that whether the texture is used by the application can be detected.
  • FIG. 8 is a further schematic flowchart of a method for virtual scene recognition and interactive key matching of an application program according to an embodiment of the present application. As shown in FIG. 8, after the application is started in step 201, the method may include:
  • Step 801 the computing device detects that the virtual operating system loads the application program data from the hard disk to the graphics card memory;
  • Step 802 The computing device computes the cyclic redundancy check code of the texture in the application data and determines whether the cyclic redundancy check code of the texture in the application data matches the cyclic redundancy check code of the preset texture;
  • Step 803 When determining that the cyclic redundancy check code of the texture in the application data matches the cyclic redundancy check code of the preset texture, the computing device records the identifier of the texture in the application data as the identifier of the preset texture.
  • When the simulator detects that the virtual operating system loads the application data from the hard disk into the graphics card memory through the glTexImage2D function, it computes the cyclic redundancy check code of the texture in the application and compares it with the cyclic redundancy check code of the preset texture in the texture picture configuration file; when the two match, the identifier of the texture in the application data is recorded as the identifier of the preset texture.
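  • A sketch of such a glTexImage2D hook is shown below; zlib's crc32 is used as an example check code, the pixel data is assumed to be tightly packed RGBA8 for the size estimate, and the hook mechanism and names are assumptions:

```cpp
// Sketch of a glTexImage2D hook that records whether the texture being uploaded
// matches a preset texture, by comparing the CRC of its pixel data against the
// CRC stored in the texture picture configuration file.
#include <GLES2/gl2.h>
#include <zlib.h>
#include <unordered_map>

static const uLong kPresetTextureCrc = 1576852035uL;       // from the example configuration entry
static std::unordered_map<GLuint, bool> g_isPresetTexture; // runtime texture id -> matched preset?

void hookedGlTexImage2D(GLenum target, GLint level, GLint internalformat,
                        GLsizei width, GLsizei height, GLint border,
                        GLenum format, GLenum type, const void* pixels) {
    if (pixels != nullptr && level == 0) {
        // Assume tightly packed RGBA8 data for the size estimate (illustrative only).
        uLong crc = crc32(0L, static_cast<const Bytef*>(pixels),
                          static_cast<uInt>(width) * static_cast<uInt>(height) * 4u);
        GLint boundTexture = 0;
        glGetIntegerv(GL_TEXTURE_BINDING_2D, &boundTexture); // the id bound by the application before uploading
        g_isPresetTexture[static_cast<GLuint>(boundTexture)] = (crc == kPresetTextureCrc);
    }
    glTexImage2D(target, level, internalformat, width, height, border, format, type, pixels);
}
```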
  • Step 602 When determining that the rendering data includes the preset texture in which the preset texture picture is located, the computing device determines whether the rendering data includes the position information of the preset texture picture within the preset texture.
  • determining whether the rendering data includes the position information of the preset texture picture within the preset texture includes:
  • Step 703 The computing device compares the texture coordinates of the texture picture in the rendering data with the texture coordinates of the preset texture picture within the preset texture;
  • Step 704 When the texture coordinates of the texture picture in the rendering data match the texture coordinates of the preset texture picture within the preset texture, the computing device determines that the rendering data includes the position information of the preset texture picture within the preset texture.
  • the simulator can compare the texture coordinates of the texture picture corresponding to the rendered vertices in the rendering data with the texture coordinates of the preset texture picture configured in the texture picture configuration file, and when the two match, determine that the rendering data includes the position information of the preset texture picture within the preset texture.
  • the VBO used by the rendering function glDrawElements function will include the texture coordinates of the texture image used by the rendering function, that is, the location coordinates of the texture image in the texture.
  • the coordinate range of the texture or texture image may be, for example, from (0, 0) to (1, 1), and this coordinate is a coordinate subjected to normalization processing.
  • the upper left corner of a texture is (0, 0)
  • the lower right corner is (1, 1).
  • the position of each texture picture in the texture in the texture can be represented by the position of the four corners of each texture picture in the coordinate system of the texture. Refer to the texture 34 and the texture picture 35 shown in FIG. 3B.
  • When recognizing the texture picture in the game frame, the simulator can analyze the VBO used by the glDrawElements function, find the identifier of texture 34, and then find the texture coordinates {(0.000000, 0.462891), (0.000000, 0.255859), (0.207031, 0.255859), (0.207031, 0.462891)}; from the found texture coordinates it can confirm that texture picture 35 within texture 34 is used in the current game frame.
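  • A sketch of this coordinate comparison, using the corner values of texture picture 35 quoted above and an assumed floating-point tolerance, could be:

```cpp
// Sketch of the coordinate comparison: match the four texture-coordinate
// corners read from the VBO against the preset corners of texture picture 35,
// with a small tolerance for floating-point noise. Names are illustrative.
#include <array>
#include <cmath>

using Corner  = std::array<float, 2>;   // (u, v) texture coordinate
using Corners = std::array<Corner, 4>;  // the four corners of a texture picture

static const Corners kTexturePicture35 = {{
    {0.000000f, 0.462891f}, {0.000000f, 0.255859f},
    {0.207031f, 0.255859f}, {0.207031f, 0.462891f}
}};

bool matchesPresetPicture(const Corners& fromVbo, float eps = 1e-4f) {
    for (int i = 0; i < 4; ++i) {
        if (std::fabs(fromVbo[i][0] - kTexturePicture35[i][0]) > eps ||
            std::fabs(fromVbo[i][1] - kTexturePicture35[i][1]) > eps) {
            return false;  // a corner differs: this is not the preset texture picture
        }
    }
    return true;           // all four corners match: texture picture 35 is in this frame
}
```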
  • Step 603 When determining that the rendering data includes the position information of the preset texture picture within the preset texture, the computing device determines that the rendering data used by the rendering function includes the preset texture picture.
  • That is, when determining that the rendering data in the VBO includes the preset texture in which the preset texture picture is located, as well as the texture coordinates of that preset texture picture within its preset texture, the simulator determines that the rendering data contains a preset texture picture to be detected from the texture picture configuration file.
  • Since the texture coordinates of the texture picture used in each rendering are the same, the rendering function called when rendering the game frame is detected and the rendering data used by that function is examined to determine whether it includes the preset texture in which the preset texture picture is located; when the rendering data includes the preset texture, it is determined whether the rendering data includes the position information of the preset texture picture within the preset texture; and when it does, it is determined that the rendering data used by the rendering function includes the preset texture picture. Only a few coordinate values need to be compared to accurately determine the game state and quickly know whether a certain texture picture is about to be displayed by the game. The number of bytes processed is small, so game frame rendering on the computing device is faster, less computing resource overhead is required, and performance is higher.
  • FIG. 9 is a schematic diagram, provided according to an embodiment of the present application, of calling the corresponding key configuration file to complete the matching process between the texture picture and the corresponding physical key of the computing device when it is recognized that the virtual scene includes a preset texture picture.
  • FIG. 9 takes the further description of step 204 in FIG. 2 as an example, and at the same time, FIG. 9 is also applicable to the description of the same or similar steps in the foregoing various other embodiments.
  • step 204 may include:
  • Step 901 When it is recognized that the virtual scene includes a preset texture picture, the computing device calls the corresponding key configuration file, determines the physical key corresponding to the preset texture picture according to the preset texture picture and the corresponding key information in the key configuration file, and enables the physical key to take effect.
  • The simulator can determine whether the current virtual scene contains a preset texture picture for which a corresponding physical key needs to be set by comparing the texture pictures in the texture picture configuration file with the texture pictures in the rendering data used by the rendering function that renders the virtual scene, and then look up the key information corresponding to the preset texture picture in the key configuration file. For example, the simulator may compare the identifier of the preset texture picture obtained from the texture picture configuration file with the identifier of the texture picture in the key configuration file, and once the two match, confirm from the key configuration file the name of the physical key corresponding to the identifier of the texture picture and the screen position coordinates corresponding to that physical key.
  • the example key configuration file is as follows:
  • the key configuration file is configured with the texture image identifier (TextureId) as "30", the name (Name) as “GetIntoCar”, and the comment (Comm) as "drive in the car”.
  • When the texture picture "GetIntoCar" is matched in the key configuration file, the key "F" and the screen position coordinates (0.656250, 0.419444) corresponding to the key "F" are obtained.
  • According to the identifier in the texture picture configuration file of the preceding example, the simulator finds the name (Itemname) "GetIntoCar" corresponding to the texture picture with TextureId "30", and then enables the GetIntoCar texture picture through the EnableSwitch in the SwitchOperation statement.
  • the corresponding key is "F”
  • the horizontal coordinate of the screen position corresponding to the key F is 0.656250
  • the vertical coordinate is 0.419444, that is, the key F on the keyboard will take effect.
  • the coordinate of the screen position corresponding to the physical key may be, for example, the position of the center of the key icon of the physical key on the display screen.
  • multiple SwitchOperation statements can be used to configure the same physical button to take effect when different texture pictures take effect.
  • the texture picture listed first has the highest priority.
  • the SwitchOperation statement can also carry the screen position coordinates corresponding to the physical key. In this case, after one texture picture takes effect, the other texture pictures are invalid, and the screen position coordinates corresponding to the physical key can only be the coordinates in the SwitchOperation statement of the texture picture that is in effect. This configuration is more flexible.
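  • A sketch of resolving a recognized texture picture to its physical key and screen position, using the field names quoted in the description (TextureId, Itemname, Point_x, Point_y) but an assumed in-memory representation, could be:

```cpp
// Sketch of resolving a recognized preset texture picture to its physical key
// and on-screen position. The container and lookup code are assumptions.
#include <optional>
#include <string>
#include <vector>

struct KeyBinding {
    int         textureId;  // TextureId of the preset texture picture
    std::string itemName;   // Itemname, e.g. "GetIntoCar"
    std::string key;        // physical key name, e.g. "F"
    float       pointX;     // Point_x: normalized screen x of the key icon
    float       pointY;     // Point_y: normalized screen y of the key icon
};

static const std::vector<KeyBinding> kKeyConfig = {
    {30, "GetIntoCar", "F", 0.656250f, 0.419444f},  // the example entry from the description
};

std::optional<KeyBinding> findBinding(int recognizedTextureId) {
    for (const auto& b : kKeyConfig) {
        if (b.textureId == recognizedTextureId) return b;  // this key takes effect for the scene
    }
    return std::nullopt;                                   // no physical key to enable in this scene
}
```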
  • Step 902 Determine the operation position of the physical key corresponding to the preset texture picture on the display screen.
  • the simulator may, for example, determine the operation position of the corresponding physical key on the display screen according to the screen position coordinates corresponding to the physical key in the key configuration file, or determine the operation position of the corresponding physical key on the display screen according to the display position of the preset texture picture on the display screen.
  • The former determination method is simple and versatile, and can also be applied when it is inconvenient to determine the operation position of the physical key on the display screen from the display position of the preset texture picture on the screen.
  • For example, when the simulator detects such a texture picture, it knows that the game is over and that the user needs to confirm whether to quit; however, the position where this picture is located cannot itself be operated, and the operation needs to be performed at the position of the exit button. In this case, therefore, the operation position on the display screen of the physical key that performs the exit operation is configured in the key configuration file in advance.
  • When the emulator detects that this texture picture is included in the game scene, it sets a key icon of the corresponding physical key at the position of the exit button so that the user can perform the exit operation.
  • Determining the operation position of the corresponding physical key on the display screen from the display position of the preset texture picture may include the following steps: finding, in the rendering data used by the rendering function, the vertex coordinates to be rendered for the preset texture picture; calculating, from those vertex coordinates, the display area of the preset texture picture on the display screen; and calculating, from that display area, the operation position of the corresponding physical key on the display screen.
  • For example, when the simulator detects that the application is rendering a game frame according to the game logic, it can find, in the vertex buffer object used by the rendering function, the vertex coordinates of the triangles in the game frame onto which the texture picture is rendered. These vertex coordinates may be, for example, the vertex coordinates of the triangles that make up the various operation buttons. As an example, these vertex coordinates may be screen positions under the second operating system, for example the positions of the vertices on the display screen under the Android operating system. Under Android, the upper-left corner of the display screen is the coordinate origin (0, 0) and the lower-right corner is the coordinate maximum; for instance, if the screen resolution under Android is 1024*576, the maximum coordinate is (1024, 576).
  • For example, when a square texture picture is rendered onto triangles in the game frame, two triangles are needed and each triangle has three vertices: the first triangle includes vertices 1, 2 and 3, and the second triangle includes vertices 4, 5 and 6. Two of these vertices coincide. The computing device can obtain the vertex coordinates of every vertex of the two triangles.
  • The simulator can then calculate the display position of the texture picture on the display screen from the vertex coordinates of the triangles in the game frame. For example, starting from those vertex coordinates, it can apply the model transformation (Model), view transformation (View) and projection transformation (Projection), followed by the viewport transformation, to obtain the actual display position and display area of the texture picture on the display screen under the first operating system; the model, view and projection transformations are referred to as the MVP transformation for short. In other words, once the simulator has obtained the vertex coordinates of the triangles in the game frame, it can compute the display position, on the display screen under the first operating system, of the texture picture pasted onto those triangles.
  • For example, the simulator may first determine the display area of the texture picture on the display screen from its display position under the first operating system, and then, based on that display area, place the operation position of the physical key corresponding to the texture picture inside the display area.
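  • As a hedged sketch of this MVP-plus-viewport calculation (assumptions: column-major matrices as in OpenGL, a combined MVP matrix already available, and the 1024*576 screen mentioned earlier; this is not the code of this application), the following fragment transforms the six triangle vertices into window coordinates and returns the bounding box in which the key icon can then be centered.

      #include <algorithm>
      #include <array>

      struct Vec4 { float x, y, z, w; };
      using Mat4 = std::array<float, 16>;   // column-major, as in OpenGL

      // v' = M * v (column-major 4x4 matrix times column vector)
      Vec4 transform(const Mat4& m, const Vec4& v) {
          return {
              m[0] * v.x + m[4] * v.y + m[8]  * v.z + m[12] * v.w,
              m[1] * v.x + m[5] * v.y + m[9]  * v.z + m[13] * v.w,
              m[2] * v.x + m[6] * v.y + m[10] * v.z + m[14] * v.w,
              m[3] * v.x + m[7] * v.y + m[11] * v.z + m[15] * v.w,
          };
      }

      struct Rect { float left, top, right, bottom; };

      // Transform the six vertices of the two triangles through the combined MVP
      // matrix, apply the viewport transform, and return the on-screen bounding
      // box of the texture picture.  screenW/screenH would be, e.g., 1024 x 576.
      Rect displayAreaOnScreen(const Mat4& mvp, const Vec4 verts[6],
                               float screenW, float screenH) {
          Rect r = {screenW, screenH, 0.0f, 0.0f};
          for (int i = 0; i < 6; ++i) {
              Vec4 clip = transform(mvp, verts[i]);
              float ndcX = clip.x / clip.w;                        // normalized device coords [-1, 1]
              float ndcY = clip.y / clip.w;
              float sx = (ndcX * 0.5f + 0.5f) * screenW;           // viewport transform
              float sy = (1.0f - (ndcY * 0.5f + 0.5f)) * screenH;  // screen y grows downwards
              r.left   = std::min(r.left, sx);
              r.top    = std::min(r.top, sy);
              r.right  = std::max(r.right, sx);
              r.bottom = std::max(r.bottom, sy);
          }
          return r;
      }

      // The key icon of the matched physical key can then be placed at the
      // center of this display area.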
  • With the virtual scene recognition and interactive key matching method of this application, the key information corresponding to a preset texture picture is looked up in the key configuration file, the corresponding physical key is determined, and its operation position on the display screen is determined so that the physical key can be set. Because the computing device can set the corresponding physical keys dynamically and in real time and make them effective, the game player can operate the game in real time and has a better game operation experience.
  • FIG. 10 is a further schematic flowchart of the virtual scene recognition and interactive key matching method for an application according to an embodiment of this application. As shown in FIG. 10, in addition to the steps shown in FIG. 2, the method further includes:
  • Step 1001: Detect, under the first operating system, an operation message entered through an input/output device for the physical key corresponding to the preset texture picture.
  • The simulator detects operation messages entered by the user, under the first operating system of the computing device, through input/output devices such as the keyboard and mouse; such an operation message is one that operates a key icon in the virtual scene shown on the display screen.
  • In the case of a game, the emulator hooks Windows messages to determine that the game player has pressed a physical key on the keyboard or mouse, and then checks in the key configuration file whether that physical key is currently effective. When it determines that the player has pressed the corresponding physical key on the keyboard or mouse, the operation is judged to be valid. Here, the corresponding physical key is the one associated with a key icon in the game scene shown on the display screen.
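  • One conventional way of "hooking Windows messages" for this purpose is a low-level keyboard hook; the sketch below is an assumed illustration only, and IsKeyActive / InjectTouchForKey are hypothetical emulator-side helpers rather than functions named in this application.

      #include <windows.h>

      // Hypothetical helpers implemented elsewhere in the emulator:
      //   IsKeyActive        - is this key enabled in the key configuration file right now?
      //   InjectTouchForKey  - write the key's screen position as touch data
      bool IsKeyActive(DWORD vkCode);
      void InjectTouchForKey(DWORD vkCode);

      static HHOOK g_keyboardHook = nullptr;

      static LRESULT CALLBACK LowLevelKeyboardProc(int nCode, WPARAM wParam, LPARAM lParam) {
          if (nCode == HC_ACTION && wParam == WM_KEYDOWN) {
              const KBDLLHOOKSTRUCT* kb = reinterpret_cast<const KBDLLHOOKSTRUCT*>(lParam);
              // Only keys that are effective for the currently recognized scene are handled.
              if (IsKeyActive(kb->vkCode)) {
                  InjectTouchForKey(kb->vkCode);
              }
          }
          return CallNextHookEx(nullptr, nCode, wParam, lParam);
      }

      void InstallKeyboardHook() {
          // The installing thread must run a Windows message loop for a low-level hook.
          g_keyboardHook = SetWindowsHookExW(WH_KEYBOARD_LL, LowLevelKeyboardProc,
                                             GetModuleHandleW(nullptr), 0);
      }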
  • Step 1002: Write the operation position, on the display screen, of the physical key corresponding to the preset texture picture into the touch screen device file; the touch screen driver reads the operation position from the touch screen device file and passes it to the application, and the application performs the interaction with the virtual scene at that operation position.
  • As described above, the simulator can obtain the operation position of the physical key to be set on the display screen from the key configuration file, or derive it from the position information of the texture picture. Then, when the emulator detects that the game player has triggered an operation on a key icon shown on the display screen, it writes the operation data corresponding to that physical key into the touch screen device file of the Android operating system. The operation data may be, for example, the operation position of the physical key on the display screen, which is not specifically limited in the embodiments of this application. Before writing, the operation position on the display screen can be converted into the corresponding Android value, that is, into the operation position under the virtual Android operating system, and then written into the touch screen device file.
  • The Android operating system on mobile phones is usually developed on top of the Linux kernel and can be divided into a kernel layer and an application layer. The application layer can be understood as the running environment of applications (apps), where an application may be a game; the kernel layer provides basic services for the application layer. The kernel layer contains drivers for the various types of hardware, called hardware drivers. By reading and writing the device files of the Android operating system, a hardware driver learns the hardware's data, and it sends the acquired hardware data up to the upper-layer application so that the application can run with it. The Android operating system has various hardware drivers and corresponding device files, such as the touch screen driver, the Bluetooth driver and the audio driver, and correspondingly a touch screen device file, a Bluetooth device file, an audio device file, and so on.
  • Taking the touch screen of a mobile phone as an example, when the touch screen is touched, the screen position coordinates (x, y) produced by the touch are sent to the touch screen device file as touch data. The touch screen driver in the kernel layer reads the touch screen device file to obtain the touch data, and then passes the touch data to the upper-layer application. The upper-layer application of the Android operating system, such as a game, then responds to the "screen touched" message according to the touch data.
  • Following this processing flow of the Android operating system, when a PC emulator simulates an Android mobile game, the PC emulator calculates, for the game frame to be rendered, the screen position coordinates corresponding to the physical key of the keyboard or mouse from the position information of the texture picture, and then writes those screen position coordinates into the touch screen device file as touch data, thereby simulating an Android touch message. In other words, the PC emulator simulates the process in which the touch screen hardware writes the touch screen device file.
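  • The sketch below illustrates, under stated assumptions, how such touch data could be injected in the form of Linux multi-touch input events; the device path /dev/input/event2 and the use of protocol-A events are assumptions for illustration and are not prescribed by this application.

      #include <fcntl.h>
      #include <linux/input.h>
      #include <unistd.h>

      // Emit one input_event into the (virtual) touch screen device file.
      static void emit(int fd, __u16 type, __u16 code, __s32 value) {
          input_event ev{};
          ev.type = type;
          ev.code = code;
          ev.value = value;
          ssize_t n = write(fd, &ev, sizeof(ev));
          (void)n;
      }

      // Simulate a tap at Android screen coordinates (x, y) by writing touch data;
      // the device path below is only an example.
      void simulateTap(int x, int y) {
          int fd = open("/dev/input/event2", O_WRONLY);
          if (fd < 0) return;

          emit(fd, EV_ABS, ABS_MT_POSITION_X, x);   // finger down at (x, y)
          emit(fd, EV_ABS, ABS_MT_POSITION_Y, y);
          emit(fd, EV_SYN, SYN_MT_REPORT, 0);
          emit(fd, EV_SYN, SYN_REPORT, 0);

          emit(fd, EV_SYN, SYN_MT_REPORT, 0);       // empty report = finger lifted
          emit(fd, EV_SYN, SYN_REPORT, 0);

          close(fd);
      }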
  • Thus, when the game player presses a physical key on the keyboard during play and the computing device detects that this key matches a key icon in the game frame shown on the display screen, the computing device can treat the key press as a valid operation. The computing device then finds the screen position coordinates (x, y) of the texture picture corresponding to that physical key on the display screen and writes these coordinates into the touch screen device file. For example, with the sample key configuration file above, the computing device writes the screen position coordinates X=0.656250, Y=0.419444 of the F key into the touch screen device file as touch data.
  • The computing device then calls the touch screen driver to read the operation data written into the touch screen device file, for example the screen position coordinates, and passes them up to the upper-layer application; this generates a "display screen pressed" message, simulating a tap on the display screen as if it were a touch screen. Continuing the example, after the touch screen driver reads the coordinates X=0.656250, Y=0.419444, the corresponding screen position coordinates under the Android operating system can be calculated (for instance, at the 1024*576 resolution mentioned above, these normalized values would map to roughly (672, 242), though the exact conversion is not specified here) and sent into the Android operating system through the touch screen driver, from which it is known that the player pressed the "get in the car" key.
  • After obtaining the screen position coordinates of the physical key read by the touch screen driver, the computing device performs a click operation at that coordinate point, thereby operating the virtual operation button in the game frame represented by the texture picture corresponding to the physical key. The computing device then outputs, according to the game logic, the result of the virtual operation button being clicked on the display screen, for example making the character in the game get into the car.
  • With the virtual scene recognition and interactive key matching method of this application, an operation entered under the first operating system through an input/output device on the physical key corresponding to the preset texture picture is detected, the operation position of that physical key on the display screen is written into the touch screen device file, the touch screen driver reads the written operation position from the touch screen device file and passes it to the application, and the application performs the interaction with the virtual scene at that operation position; in this way the user can operate the game with the keyboard and mouse.
  • FIG. 11 is a schematic diagram of an apparatus for virtual scene recognition and interactive key matching of an application according to an embodiment of this application. As shown in FIG. 11, the apparatus may include:
  • a startup module 1101, configured to start the application in the virtual operating system;
  • a rendering detection module 1102, configured to render the virtual scene to be presented by the application and obtain the rendering data used to render the virtual scene;
  • a texture picture recognition module 1103, configured to identify whether a preset texture picture is included in the rendering data;
  • a key setting module 1104, configured to, when it is recognized that the rendering data includes a preset texture picture, call the corresponding key configuration file and perform the matching between the preset texture picture and the corresponding physical key of the computing device;
  • a display module 1105, configured to present, on the display screen of the computing device, the virtual scene and the key configuration information corresponding to the virtual scene, the key configuration information being associated with the matched physical keys.
  • The rendering detection module 1102 is configured to detect the application's calls to the rendering function, obtain the rendering data in the vertex buffer object when the rendering function is called, and use the rendering data in the vertex buffer object as the rendering data for rendering the virtual scene; the rendering function is used to render the virtual scene (one possible way of intercepting such a call is sketched below).
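  • A hedged sketch of such interception follows; it assumes the emulator's translation layer can wrap glDrawElements, that a desktop OpenGL loader exposes glGetBufferSubData, and that tightly packed (u, v) texture coordinates sit at a known offset in the bound vertex buffer object. The wrapper's extra parameters and the callback OnTexturePictureCandidate are illustrative, not part of this application.

      #include <GL/glew.h>   // assumed loader exposing desktop-GL entry points such as glGetBufferSubData
      #include <vector>

      // Illustrative callback implemented elsewhere in the emulator: compares the
      // texture id and texture coordinates against the texture picture configuration file.
      void OnTexturePictureCandidate(GLint textureId, const std::vector<float>& texCoords);

      // Wrapper the emulator could install around the guest's glDrawElements call.
      // uvOffset/vertexCount describe where the (u, v) pairs sit in the bound VBO;
      // a real implementation would use the vertex layout recorded when the buffer
      // and attribute pointers were set up.
      void hooked_glDrawElements(GLenum mode, GLsizei count, GLenum type, const void* indices,
                                 GLintptr uvOffset, GLsizei vertexCount) {
          GLint textureId = 0;
          glGetIntegerv(GL_TEXTURE_BINDING_2D, &textureId);   // texture used by this draw call

          GLint vbo = 0;
          glGetIntegerv(GL_ARRAY_BUFFER_BINDING, &vbo);        // VBO holding the vertex data

          std::vector<float> texCoords(static_cast<size_t>(vertexCount) * 2);
          if (vbo != 0 && vertexCount > 0) {
              glGetBufferSubData(GL_ARRAY_BUFFER, uvOffset,
                                 texCoords.size() * sizeof(float), texCoords.data());
          }
          OnTexturePictureCandidate(textureId, texCoords);      // check against preset pictures

          glDrawElements(mode, count, type, indices);           // forward the original call
      }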
  • With this arrangement, an application is started in the virtual operating system, the virtual scene to be presented is rendered to obtain the rendering data used for rendering it, the preset texture picture is identified in the rendering data, and when a preset texture picture is recognized in the virtual scene the corresponding key configuration file is called to complete the matching between the texture picture and the corresponding physical key of the computing device; the virtual scene and the key configuration information corresponding to it are then presented on the screen of the computing device.
  • Based on the recognition of texture pictures in the mobile game application, the embodiments of this application can therefore identify game scenes quickly and efficiently and automatically lay out the game operation keys in the recognized scenes, dynamically and in real time, making the key setup intelligent. Game players can conveniently experience the game with the mouse and keyboard and no longer need to set up a large number of operation keys themselves, so they have a better game operation experience; this kind of key setup based on game scene recognition is more intelligent and more effective.
  • FIG. 12 is a schematic diagram of the texture picture recognition module in the apparatus according to an embodiment of this application. As shown in FIG. 12, the texture picture recognition module 1103 may include:
  • a texture determination module 1201, configured to determine whether the rendering data includes the preset texture in which the preset texture picture is located;
  • a texture coordinate determination module 1202, configured to, when the texture determination module 1201 determines that the rendering data includes the preset texture in which the preset texture picture is located, determine whether the rendering data includes the position information of the preset texture picture within the preset texture;
  • a texture picture confirmation module 1203, configured to, when the texture coordinate determination module 1202 determines that the rendering data includes the position information of the preset texture picture within the preset texture, determine that the rendering data used by the rendering function includes the preset texture picture.
  • As shown in FIG. 13, the texture determination module 1201 may include: a texture identifier comparison module 1301, configured to compare the texture identifiers in the rendering data with the recorded identifier of the preset texture; and a texture confirmation module 1302, configured to, when the texture identifier comparison module 1301 determines that a texture identifier in the rendering data matches the recorded identifier of the preset texture, determine that the rendering data includes the preset texture in which the preset texture picture is located.
  • The texture coordinate determination module 1202 may include: a texture coordinate comparison module 1303, configured to compare the texture coordinates of the texture picture in the rendering data with the texture coordinates of the preset texture picture within the preset texture; and a texture coordinate confirmation module 1304, configured to, when the texture coordinate comparison module 1303 determines that the texture coordinates of the texture picture in the rendering data match the texture coordinates of the preset texture picture within the preset texture, determine that the rendering data includes the position information of the preset texture picture within the preset texture.
  • FIG. 14 is a schematic diagram of another virtual scene recognition and interactive key matching apparatus for an application according to an embodiment of this application. As shown in FIG. 14, on the basis of FIG. 13, the apparatus further includes the following modules (a CRC-matching sketch follows this list):
  • a texture loading detection module 1401, configured to detect whether the virtual operating system loads application data from the hard disk into the graphics card memory;
  • a verification module 1402, configured to, after it is detected that the virtual operating system has loaded application data from the hard disk into the graphics card memory, compute the cyclic redundancy check code of a texture in the application data and determine whether it matches the cyclic redundancy check code of the preset texture;
  • a recording module 1403, configured to, when the verification module determines that the cyclic redundancy check code of the texture in the application data matches that of the preset texture, record the identifier of that texture in the application data and use it as the identifier of the preset texture.
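  • A minimal sketch of the CRC matching performed by modules 1402 and 1403 is shown below, assuming zlib's crc32 is available; onTextureUpload and g_presetTextureId are illustrative names and not part of this application.

      #include <zlib.h>      // assumed available for crc32()

      // Illustrative: remember which run-time texture id corresponds to the preset texture.
      static int g_presetTextureId = -1;

      // Called when a texture upload (e.g. via glTexImage2D) is detected; presetCrc
      // comes from the texture picture configuration file.
      void onTextureUpload(int runtimeTextureId, const unsigned char* pixels,
                           unsigned int size, unsigned long presetCrc) {
          unsigned long crc = crc32(0L, Z_NULL, 0);   // initial CRC value
          crc = crc32(crc, pixels, size);             // CRC over the texture's pixel data
          if (crc == presetCrc) {
              // Texture ids can change every run, but the texture's CRC does not, so
              // record the id assigned in this run as the preset texture's identifier.
              g_presetTextureId = runtimeTextureId;
          }
      }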
  • Because the texture coordinates of a texture picture within its texture are fixed in the game, the texture coordinates used for a given texture picture are the same in every rendering pass. Therefore, by detecting the rendering function called when rendering a game frame, the apparatus determines whether the rendering data includes the preset texture in which the preset texture picture is located; if so, it then determines whether the rendering data includes the position information of the preset texture picture within that preset texture, and if that position information is present it concludes that the rendering data used by the rendering function includes the preset texture picture. Only a few coordinate values need to be compared to know quickly whether a given texture picture is about to be displayed by the game; because relatively few bytes are processed, the computing device renders game frames faster, consumes fewer computing resources, and achieves higher performance.
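  • The coordinate comparison itself can be as small as the following sketch (an illustration with an assumed epsilon tolerance, not the code of this application), using the four corner coordinates of a texture picture such as those listed for the example texture earlier in this description.

      #include <array>
      #include <cmath>

      // Four (u, v) corner coordinates of a texture picture inside its texture,
      // e.g. {(0.000000, 0.462891), (0.000000, 0.255859),
      //       (0.207031, 0.255859), (0.207031, 0.462891)}.
      using UvQuad = std::array<std::array<float, 2>, 4>;

      // Decide whether the texture coordinates found in the rendering data match the
      // preset texture picture.  The epsilon is an assumption to absorb float noise.
      bool matchesPresetPicture(const UvQuad& fromRenderData, const UvQuad& preset,
                                float eps = 1e-5f) {
          for (int corner = 0; corner < 4; ++corner) {
              for (int axis = 0; axis < 2; ++axis) {
                  if (std::fabs(fromRenderData[corner][axis] - preset[corner][axis]) > eps) {
                      return false;   // one differing coordinate rules this picture out
                  }
              }
          }
          return true;                // all eight values agree: the preset picture is being drawn
      }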
  • FIG. 15 is a schematic diagram of the key setting module of the apparatus according to an embodiment of this application. As shown in FIG. 15, the key setting module 1104 includes:
  • a key determination module 1501, configured to, when it is recognized that the rendering data includes a preset texture picture, call the corresponding key configuration file, determine the physical key corresponding to the preset texture picture according to the preset texture picture and the corresponding key information in the key configuration file, and enable that physical key;
  • a key position determination module 1502, configured to determine the operation position, on the display screen, of the physical key corresponding to the preset texture picture.
  • With this virtual scene recognition and interactive key matching apparatus, the key information corresponding to a preset texture picture is looked up in the key configuration file, the corresponding physical key is determined, and its operation position on the display screen is determined; that is, the computing device can set the corresponding physical keys dynamically and in real time and make them effective, so the game player can operate the game in real time and has a better game operation experience.
  • FIG. 16 is a schematic diagram of the hardware structure of a computing device according to an embodiment of this application. As shown in FIG. 16, the computing device may differ considerably depending on its configuration or performance, and may include one or more central processing units (CPUs) 1601 (for example, one or more processors), a memory 1602, and one or more storage media 1605 (for example, one or more mass storage devices) storing application programs 1603 or data 1604. The memory 1602 and the storage medium 1605 may be short-term storage or persistent storage. The program stored in the storage medium 1605 may include one or more modules (not shown in the figure), and each module may include a series of instruction operations on the computing device. Further, the central processing unit 1601 may be configured to communicate with the storage medium 1605 and execute, on the computing device, the series of instruction operations stored in the storage medium 1605.
  • Specifically, the application program 1603 stored in the storage medium 1605 includes the application of the virtual scene recognition and interactive key matching apparatus, and this program may include the modules of the apparatus described above. Further, the central processing unit 1601 may be configured to communicate with the storage medium 1605 and execute, on the computing device, the series of operations corresponding to the game operation application stored in the storage medium 1605; the game operation application may also include the operations described in the method flowcharts.
  • The computing device may also include one or more power supplies 1606, one or more wired or wireless network interfaces 1607, one or more input/output interfaces 1608, and/or one or more operating systems 1609, such as Windows Server(TM), Mac OS X(TM), Unix(TM), Linux(TM), FreeBSD(TM), and so on.
  • An embodiment of this application provides a computing device capable of constructing or running a virtual operating system. The computing device includes a processor and a memory; the memory stores computer-readable instructions, and the computer-readable instructions are executed by the processor to carry out the virtual scene recognition and interactive key matching method described in the foregoing method embodiments.
  • With such a computing device, an application is started in the virtual operating system, the virtual scene to be presented by the application is rendered to obtain the rendering data used for rendering it, the preset texture picture is identified in the rendering data, and when a preset texture picture is recognized in the virtual scene the corresponding key configuration file is called to complete the matching between the texture picture and the corresponding physical key of the computing device; the virtual scene and the key configuration information corresponding to it are presented on the screen of the computing device.
  • In this way, based on the recognition of texture pictures in the mobile game application, game scenes can be identified quickly and efficiently and the game operation keys can be laid out automatically and reasonably in the recognized scenes, dynamically and in real time. The key setup becomes intelligent: game players can conveniently experience the game with the mouse and keyboard and no longer need to set up a large number of operation keys themselves, so they have a better game operation experience. This key setup method based on game scene recognition is therefore more intelligent and more effective.
  • These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • The functional modules in the embodiments of this application may be integrated into one processing unit, each module may exist alone physically, or two or more modules may be integrated into one unit. The integrated unit may be implemented in the form of hardware or of a software functional unit.
  • Each example of this application may also be implemented by a data processing program executed by a data processing device such as a computer; such a data processing program constitutes this application. In addition, a data processing program usually stored in a storage medium is executed by reading the program directly out of the storage medium or by installing or copying the program into a storage device (such as a hard disk and/or memory) of the data processing device, so such a storage medium also constitutes this application. The storage medium may use any type of recording method, for example a paper storage medium (such as paper tape), a magnetic storage medium (such as a floppy disk, hard disk, or flash memory), an optical storage medium (such as a CD-ROM), or a magneto-optical storage medium (such as an MO disc).
  • Accordingly, this application also provides a non-volatile storage medium storing a game operation program, the game operation program being used to execute any one of the examples of the methods in the foregoing embodiments of this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Stored Programmes (AREA)

Abstract

A method for virtual scene recognition and interactive key matching of an application, and a computing device. The method includes: starting the application in a virtual operating system (201); rendering the virtual scene to be presented by the application to obtain the rendering data used to render the virtual scene (202); identifying whether the rendering data includes a preset texture picture (203); when it is recognized that the rendering data includes a preset texture picture, calling a key configuration file and performing the matching between the preset texture picture and the corresponding physical key of the computing device (204); and presenting, on a display screen, the virtual scene and the key configuration information corresponding to the virtual scene (205). With this method, game scenes can be recognized quickly and efficiently based on the recognition of texture pictures in a mobile game application, and the game operation keys can be laid out in the game scene dynamically and in real time, making the key setup intelligent.

Description

应用程序的虚拟场景识别与交互键位匹配方法及计算设备
本申请要求于2018年11月09日提交的申请号为201811330926.1、发明名称为“应用程序的虚拟场景识别与交互键位匹配方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及互联网技术领域,特别是涉及一种应用程序的虚拟场景识别与交互键位匹配方法及计算设备。
背景技术
随着游戏技术的发展,游戏开发商开发了许多在手机上运行的游戏,例如,用于安卓操作系统的手机游戏、用于iOS操作系统的手机游戏等等。由于游戏玩家在玩手机游戏时,通常是通过触摸屏进行操作的,因此,手机游戏在设计时,通常仅被设计为支持触摸操作,不支持鼠标和键盘操作。然而,一方面由于手机屏幕尺寸的限制,在手机上很难有较好的视觉体验和视觉效果;另一方面,在手机上,通过触摸屏幕也很难对手机游戏进行很多较为复杂的操作。因此,为了达到更好的视觉体验和视觉效果以及对手机游戏进行更为复杂的操作,有时候游戏玩家希望在其个人电脑(PC,Personal Computer)上玩手机游戏,并通过使用鼠标和键盘来实现对手机游戏的操作。
针对这种情况,需要将支持触摸操作的手机游戏转换为支持鼠标和键盘操作的电脑游戏。目前,市面上有PC模拟器用鼠标和键盘模拟触摸操作实现这种转换功能。在这种PC模拟器中,需要游戏玩家在游戏之前或者在游戏过程中自行设置游戏操作按键,之后才能通过游戏操作按键来操作手机游戏。但由于手机游戏越来越趋于复杂化,操作方式也越来越复杂,因此可能需要游戏玩家自行设置大量的游戏操作按键,由于人工设置游戏操作按键的方式缺乏智能性,所以这种PC模拟器中的按键设置和游戏操作方式已不能够满足游戏玩家玩手机游戏时对游戏操作体验的要求。
发明内容
本申请实施例提供了一种应用程序的虚拟场景识别与交互键位匹配方法、装置、计算机存储介质以及计算设备。
根据本申请实施例,所述应用程序的虚拟场景识别与交互键位匹配方法,应用于能够构建或运行虚拟操作系统的计算设备,所述应用程序的虚拟场景识别与交互键位匹配方法包括:
在所述虚拟操作系统中启动应用程序;
对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据;
当识别到所述渲染数据中包括预设的贴图图片时,调用键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理;
在所述计算设备的显示屏幕上呈现所述虚拟场景及与所述虚拟场景对应的按键配置信息,所述按键配置信息与匹配到的实体按键关联。
根据本申请实施例,所述应用程序的虚拟场景识别与交互键位匹配装置,应用于能够构建或运行虚拟操作系统的计算设备,所述应用程序的虚拟场景识别与交互键位匹配装置包括:
启动模块,用于在所述虚拟操作系统中启动应用程序;
渲染检测模块,用于对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据;
贴图图片识别模块,用于识别渲染检测模块得到的所述渲染数据中是否包括预设的贴图图片;
键位设置模块,用于当贴图图片识别模块识别到所述虚拟场景中包括预设的贴图图片时,调用对应的键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理;
显示模块,用于在所述计算设备的显示屏幕上呈现所述虚拟场景及与所述虚拟场景对应的按键配置信息,所述按键配置信息与匹配到的实体按键关联。
根据本申请实施例,所述计算机存储介质存储有多条指令,所述指令适于由处理器加载并执行以下步骤:
在所述虚拟操作系统中启动应用程序;
对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据;
当识别到所述渲染数据中包括预设的贴图图片时,调用键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理;
在所述计算设备的显示屏幕上呈现所述虚拟场景及与所述虚拟场景对应的按键配置信息,所述按键配置信息与匹配到的实体按键关联。
根据本申请实施例,所述计算设备能够构建或运行虚拟操作系统,所述计算设备包括:处理器和存储器,所述存储器上存储有计算机可读指令,所述计算机可读指令由所述处理器执行以完成以下操作:
在所述虚拟操作系统中启动应用程序;
对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据;
当识别到所述渲染数据中包括预设的贴图图片时,调用键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理;
在所述计算设备的显示屏幕上呈现所述虚拟场景及与所述虚拟场景对应的按键配置信息,所述按键配置信息与匹配到的实体按键关联。
在本申请实施例中,通过在虚拟操作系统中启动应用程序,对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所用的渲染数据,识别所述渲染数据中预设的贴图图片,当识别到所述虚拟场景中有预设的贴图图片时,调用对应的键位配置文件,以完成所述贴图图片与所述计算设备的对应实体按键的匹配处理,在所述计算设备的屏幕上呈现所述虚拟场景及与该虚拟场景对应的按键配置信息,本申请实施例能够基于对手机游戏应用程序中贴图图片的识别,快速高效地识别游戏场景,实时动态地在识别到的游戏场景中自动合理布局游戏操作按键,使得游戏操作按键设置智能化,游戏玩家能很方便地使用鼠标和键盘体验游戏,不再需要自行设置大量游戏操作按键,使得游戏玩家能够有较好的游戏操作体验,因此该种基于游戏场景识别的按键设置方式较为智能化,效果较佳。
附图说明
为了更清楚地说明本申请中的技术方案,下面将对实施例描述中所需要使用的附图作简单的介绍。
图1为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法及装置的应用环境示意图;
图2为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法的流程示意图;
图3A为根据本申请实施例提供的示例的手机游戏中的贴图和贴图图片关系示意图;
图3B为根据本申请实施例提供的示例的手机游戏中的贴图图片在贴图中的位置信息示意图;
图4A为根据本申请实施例提供的一个示例游戏场景的示意图;
图4B为根据本申请实施例提供的另一个示例游戏场景的示意图;
图5为根据本申请实施例提供的对应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所用的渲染数据的流程示意图;
图6为根据本申请实施例提供的识别渲染数据中预设的贴图图片的流程示意图;
图7为根据本申请实施例提供的对图6中的步骤601、602进行详细描述的流程示意图;
图8为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法的进一步流程示意图;
图9为根据本申请实施例提供的当识别到所述虚拟场景中有预设的贴图图片时,调用对应的键位配置文件,以完成所述贴图图片与所述计算设备的对应实体按键的匹配处理的流程示意图;
图10为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法的进一步流程示意图;
图11为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配装置示意图;
图12为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配装置中的贴图图片识别模块的示意图;
图13为根据本申请实施例提供的贴图图片识别模块中的贴图判断模块和纹理坐标判断模块的示意图;
图14为根据本申请实施例提供的对应用程序的虚拟场景识别与交互键位匹配装置的进一步描述示意图;
图15为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配装置的键位设置模块示意图;
图16为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配装置所在的计算设备的硬件结构示意图。
具体实施方式
以下结合说明书附图及具体实施例进一步说明本申请。应当理解,此处所描述的具体实施例仅用以解释本申请,并不用于限定本申请。
本申请实施例提出了一种应用程序的虚拟场景识别与交互键位匹配方法和装置,应用于能够构建或运行虚拟操作系统的计算设备,该虚拟操作系统用于运行前述应用程序。在一种可能的实现方式中,前述应用程序可以是安装在计算设备上的手机游戏应用程序,本申请实施例对此不进行具体限定。
通过使用本申请实施例的应用程序的虚拟场景识别与交互键位匹配方法,在计算设备运行手机游戏应用程序的过程中,能够基于对手机游戏应用程序中贴图图片的识别,快速高效地识别游戏场景,实时动态地在识别到的游戏场景中自动合理布局游戏操作按键,使得游戏操作按键设置智能化,游戏玩家无需在游戏场景中自行设置游戏操作按键,仅需根据计算设备自动设置的游戏操作按键,通过操作键盘和鼠标上对应的实体按键,便可实现对手机游戏进行操作,该种基于游戏场景识别的按键设置方式能够让游戏玩家有较好的游戏操作体验,较为智能化,效果较佳。
图1为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法和装置的应用环境示意图。如图1所示,该应用环境包括计算设备10和服务器20。其中计算设备10和服务器20可以为个人计算机。计算设备10配置有键盘11和鼠标12。在计算设备10上安装有手机游戏应用程序以及应用程序的虚拟场景识别与交互键位匹配装置30。应用程序的虚拟场景识别与交互键位匹配装置30可以体现为一个PC模拟器的形式,例如,模拟安卓手机游戏的PC安卓模拟器。在下文中PC模拟器也简称为模拟器。
所述应用程序的虚拟场景识别与交互键位匹配装置30可以执行应用程序的虚拟场景识别与交互键位匹配方法,用于在虚拟操作系统中启动应用程序;对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所用的渲染数据;识别所述渲染数据中预设的贴图图片;当识别到所述虚拟场景中有预设的贴图图片时,调用对应的键位配置文件,以完成所述贴图图片与所述计算设备的对应实体按键的匹配处理;在所述计算设备的显示屏幕上呈现所述 虚拟场景及与该虚拟场景对应的按键配置信息。
服务器端20可以用于配合手机游戏的更新,向计算设备10下发更新的预配置数据,例如预配置数据包括需要在游戏中设置的按键的操作坐标数据等等。本申请实施例中的按键包括键盘11上的按键和鼠标12的左键、右键等等。在本申请各个实施例的描述中,“手机游戏”也简称为“游戏”。
图2为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法的流程示意图。所述应用程序的虚拟场景识别与交互键位匹配方法可以应用于计算设备上,例如个人计算机上。如图2所示,该应用程序的虚拟场景识别与交互键位匹配方法包括以下步骤:
步骤201,计算设备在虚拟操作系统中启动应用程序。
根据本申请实施例,例如在计算设备上运行有第一操作系统,而应用程序是适用于第二操作系统的应用程序,因此,当需要在计算设备上运行该适用于第二操作系统的应用程序时,需要在运行第一操作系统的计算设备上构建并运行一个和第二操作系统相同的虚拟操作系统,用以运行该适用于第二操作系统的应用程序。一种可行的方式为,通过在计算设备上安装一个模拟器,用虚拟机构造一个和第二操作系统相同的虚拟操作系统,给应用程序提供运行环境。
第一操作系统例如为Windows、Linux等操作系统,第二操作系统例如为安卓操作系统,虚拟操作系统例如为虚拟安卓操作系统。虚拟机(Virtual Machine)是指通过软件模拟的具有完整硬件系统功能的、运行在一个完全隔离环境中的完整计算机系统。虚拟操作系统是第二操作系统的全新虚拟镜像,它具有和真实的第二操作系统完全一样的功能,进入虚拟操作系统后,操作都是在这个独立的虚拟操作系统中进行,例如,虚拟操作系统可以独立安装运行软件,保存数据,有自己的独立桌面,不会影响计算设备上的第一操作系统。在一种可能的实现方式中,本申请实施例的虚拟机例如可为VirtualBox虚拟机。
例如,在计算设备上运行有Windows操作系统,而手机游戏应用程序适用于安卓操作系统,当要在计算设备上运行手机游戏应用程序时,在计算设备上可以运行模拟器,模拟器用虚拟机构造一个虚拟安卓操作系统,给手机游戏应用程序提供运行环境。在运行模拟器后,用户可以在模拟器的界面中下载手机游戏应用程序,启动手机游戏应用程序。
步骤202,计算设备对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据。
根据本申请实施例,应用程序在虚拟操作系统上运行时,可能会根据应用程序本身的运行逻辑需要在显示屏幕上呈现各种虚拟场景。而应用程序在显示屏幕上呈现这些虚拟场景之前,需要先在后台对这些待呈现的虚拟场景进行相应的渲染处理操作。本申请实施例通过检测应用程序在后台的渲染处理操作,得到需要在显示屏幕上呈现的虚拟场景的信息,进而根据得到的需要在显示屏幕上呈现的虚拟场景的信息,得知当前应用程序的运行状态,以进行相应的处理。在一种可能的实现方式中,该相应的处理例如是除应用程序本身的运行逻辑之外的额外的处理,例如设置外挂,开发基于第三方应用逻辑的上层应用等。 根据本申请实施例,所述相应的处理可以是使能实体按键生效,在虚拟场景中用实体按键操作替代应用程序原来支持的触摸屏的触摸操作。
根据本申请实施例,应用程序可以通过渲染函数对虚拟场景进行渲染。应用程序在渲染一个虚拟场景时,可能进行多次渲染操作,多次调用渲染函数,模拟器会检测到对渲染函数的每次调用,获取渲染函数每次调用时关联的渲染数据,即模拟器可得到渲染虚拟场景所使用的渲染数据。
例如,当手机游戏应用程序在计算设备上通过模拟器模拟的虚拟安卓操作系统中运行时,需要根据其游戏逻辑在显示屏幕上呈现游戏场景,游戏场景例如为游戏帧。模拟器可以检测渲染游戏帧过程中应用程序进行每次渲染操作时对渲染函数的调用,进而获得渲染函数使用的各种渲染数据。在一个游戏帧的渲染结束后会有一个刷新操作,开始渲染下一个游戏帧。模拟器可以通过检测所述刷新操作,判断渲染一个游戏帧过程中每一次对渲染函数的调用。
步骤203,计算设备识别所述渲染数据中是否包括预设的贴图图片。
根据本申请实施例,模拟器识别渲染数据中预设的贴图图片。在一种可能的实现方式中,预设的贴图图片例如是需要依据它来确定应用程序当前的运行状态、识别当前的虚拟场景,以对应用程序进行除其本身的运行逻辑之外的额外的处理的贴图图片。模拟器在检测到渲染所述虚拟场景所用的渲染数据中包括预设的贴图图片时,可以认为该贴图图片生效。
根据本申请实施例,在应用程序为手机游戏应用程序的情况下,由于游戏帧中可能包括很多贴图图片,在这些贴图图片中,有的贴图图片是可由游戏玩家操作的虚拟操作按钮的贴图图片,用于当手机游戏应用程序运行在手机上时,游戏玩家可以通过用手指触摸显示在触摸屏上的虚拟操作按钮,进行相应的游戏操作;有的贴图图片不是虚拟操作按钮的贴图图片,不能由游戏玩家操作,例如一些提示性图片。
因此,在设计将手机游戏应用程序运行在计算设备上时,模拟器设计者可以针对手机游戏中虚拟操作按钮的贴图图片,预先设置对应的实体按键,以实现用按键功能模拟触摸屏情况下的触摸功能。针对其他贴图图片,则可以不设置实体按键。模拟器设计者可以将需要设置对应的实体按键的贴图图片配置为一个配置文件,称为贴图图片配置文件。在手机游戏应用程序的运行过程中,模拟器识别渲染数据中是否有贴图图片配置文件中预设的贴图图片。
在一种可能的实现方式中,贴图图片配置文件中还可以包括需要检测的贴图图片所在的贴图的循环冗余校验码(CRC,Cyclic Redundancy Check,)、贴图图片在其所在贴图中的位置坐标(Coordinate)、贴图图片的标识(TextureId)、高度(height)、宽度(width)、备注(comm)等信息。作为一个示例,贴图图片的标识例如可以用数字表示。贴图图片通常为矩形,贴图图片在其所在的贴图中的位置坐标可以用贴图图片的四个角在该贴图中的位置坐标表示。
示例性地,贴图图片配置文件例如包括以下内容:
Figure PCTCN2019109985-appb-000001
Figure PCTCN2019109985-appb-000002
该示例的贴图图片配置文件配置了需要模拟器在应用程序运行的过程中,检测所渲染的虚拟场景中循环校验码为1576852035的贴图中位置为{1061945344,1038876672,1061945344,1045823488,1063518208,1038876672,1063518208,1045823488}的贴图图片。
手机游戏开发者在开发手机游戏时,通常会将手机游戏所需要的一些小的图片放在一个大图里。在本申请实施例中,这个大图称为贴图,小的图片称为贴图图片。手机游戏中可能有多个贴图,每个贴图有其对应的标识,贴图中的各个贴图图片在该贴图中的位置是固定的。贴图可以是图片的格式。贴图标识例如可以为贴图的名称,而贴图的名称可以是一串数字或者字符串。
图3A为根据本申请实施例提供的示例的手机游戏中的贴图和贴图图片关系示意图。如图3A所示,贴图30为绝地求生(PUBG,Player Unknown’s Battle Grounds)游戏中的贴图。在贴图30中,有各种小的贴图图片,例如,贴图图片31、32、33等等。
图3B为根据本申请实施例提供的示例的手机游戏中的贴图图片在贴图中的位置信息示意图。如图3B所示,在贴图34中有五个贴图图片,分别标记为35、36、37、38、39。贴图图片35在贴图34中的位置坐标可以表示为{(0.000000,0.462891),(0.000000,0.255859),(0.207031,0.255859),(0.207031,0.462891)}。其中,坐标(0.000000,0.462891)表示贴图图片35的左下角,(0.000000,0.255859)表示其左上角,坐标(0.207031,0.255859)表示其右上角,坐标(0.207031,0.462891)表示其右下角。这个图中所举的例子是OpenGL ES(OpenGL for Embedded System)中的贴图图片的示例。OpenGL ES是OpenGL的扩展,主要用于手机、掌上电脑(PDA,Personal Digital Assistant)和游戏主机等嵌入式设备。例如,对于图3B所示的贴图图片35,在贴图图片配置文件中配置该贴图图片35时,可以配置Coordinate的取值对应于贴图图片35的四个角的坐标值。
渲染函数在渲染游戏帧时,会将贴图的标识和贴图中贴图图片的位置坐标,作为贴图图片的数据来使用。当模拟器检测到应用程序根据游戏逻辑在显示屏幕上渲染游戏帧时,比较渲染函数使用的渲染数据中的贴图的标识和贴图中贴图图片的位置坐标,与贴图图片配置文件中预设的贴图的标识以及预设的贴图中的贴图图片的位置坐标,判断渲染数据及游戏帧中是否包括需要设置对应实体按键的预设的贴图图片。
步骤204,当识别到所述渲染数据中包括预设的贴图图片时,计算设备调用对应的键位配置文件,执行所述贴图图片与所述计算设备的对应实体按键的匹配处理。
根据本申请实施例,模拟器设计者可以将贴图图片和对应的实体按键的相关信息设置为配置文件,例如称为键位配置文件。计算设备中可以预先存储应用程序所需的键位配置文件。当在计算设备上运行应用程序的过程中,模拟器 在识别出虚拟场景中包括预设的贴图图片时,根据预设的贴图图片在所述键位配置文件中进行查找,得到预设的贴图图片对应的按键信息,在虚拟场景中设置与实体按键对应的按键图标,并使能实体按键生效,可以操作。键位配置文件可以是从服务器上下载到计算设备上,并且可以从服务器下载更新。
键位配置文件中可以包括需要设置对应实体按键的贴图图片的信息,例如包括贴图图片的标识(TextureId)、贴图图片的描述或名称(name)(例如描述贴图图片所对应的功能)、贴图图片的备注(Comm)。
键位配置文件中还包括贴图图片对应的实体按键的相关信息,例如实体按键的名称(Itemname)、实体按键对应的屏幕位置坐标(Point_x,Point_y)、实体按键的功能描述(Description)。所述实体按键的功能例如可以为与其对应的贴图图片所关联的游戏中虚拟操作按钮的功能。除此之外,键位配置文件还可以配置在检测到渲染贴图图片时实体按键生效,在显示屏幕上显示对应的按键图标,处于用户可操作状态。
同一个贴图图片,在键位配置文件中的标识和在贴图图片配置文件中的标识相同或对应,使得键位配置文件和贴图图片配置文件中相同的贴图图片的相关信息可以关联起来。
根据本申请实施例,在应用程序为手机游戏应用程序的情况下,模拟器可以在确定游戏场景中包括贴图图片配置文件中的预设的贴图图片时,根据所述键位配置文件查找到预设的贴图图片所对应的按键信息,使能对应的实体按键生效。
例如,对于绝地求生游戏,键位配置文件中可以预先设置捡物贴图图片对应键盘上的F键、G键、H键等,查看周围环境的贴图图片对应Alt键,瞄准贴图图片对应右键,跳跃贴图图片对应空格键,蹲下贴图图片对应C键,趴下贴图图片对应Z键,装弹贴图图片对应R键,方向控制贴图图片分别对应W键(向前)、S键(向后)、A键(向左)、D键(向右),身体左倾贴图图片对应Q键,身体右倾贴图图片对应E键,开/关麦克风贴图图片对应Y键,开/关声音贴图图片对应T键,武器切换贴图图片对应数字键1、2、3等,背包对应Table键,压车头贴图图片对应Q键,抬车头贴图图片对应E键,下车贴图图片对应F键,鸣笛贴图图片对应鼠标左键,左手射击贴图图片对应鼠标右键,等等。另外,键位配置文件中还包括这些实体按键对应的屏幕位置坐标。
计算设备在配置键位配置文件时,可以配置一键多用,使得在不同的游戏场景,相同的实体按键可以有不同的操作功能。
步骤205,计算设备在显示屏幕上呈现所述虚拟场景及与该虚拟场景对应的按键配置信息。
根据本申请实施例,所述按键配置信息可以是按键的相关信息,例如实体按键的按键图标。
由于显示屏幕是计算设备的,其是第一操作系统下的硬件设备,因此,在呈现所述虚拟场景及所述按键配置信息时,可以是在第一操作系统下,在显示屏幕上呈现所述虚拟场景及所述按键配置信息。更进一步地,模拟器在第一操作系统下,在显示屏幕上呈现所述虚拟场景,并在所述实体按键对应的屏幕位 置坐标处,显示设置的实体按键的按键图标。
在一种可能的实现方式中,模拟器在呈现所述虚拟场景时,可以将所述虚拟场景在虚拟操作系统下的坐标转换为在第一操作系统下的坐标,再在显示屏幕上呈现虚拟场景。例如将虚拟场景的在安卓操作系统下的坐标转换为在Window s操作系统下的坐标。
根据本申请实施例,可以将实体按键的名称设置在一个背景图片上,形成所述按键图标。所述背景图片可以为圆形、椭圆形等,并将背景图片的中心位置设置在实体按键对应的贴图图片的显示区域之内,使得二者有重叠区域。根据本申请实施例,还可以将实体按键的按键图标设置在预设的位置上。在设置按键图标时,可以尽量避免遮挡住贴图图片的主体图像。背景图片可以设置为具有一定的透明度。
图4A为根据本申请实施例提供的一个示例游戏场景的示意图,其中是绝地求生游戏的一个游戏帧。模拟器在该游戏帧中,设置背包贴图图片400对应Table键,捡物贴图图片402、403、404对应F键、G键、H键,查看周围环境贴图图片405对应Alt键,瞄准贴图图片406对应鼠标右键,跳跃贴图图片407对应空格键,与蹲下贴图图片408对应C键,趴下贴图图片409对应Z键,装弹贴图图片410对应R键,左手射击贴图图片411对应鼠标左键,方向控制贴图图片412对应W键、S键、A键、D键,身体左倾贴图图片413对应Q键,身体右倾贴图图片414对应E键,开/关麦克风贴图图片415对应Y键,开/关声音贴图图片416对应T键,武器切换贴图图片417、418、419对应数字键1、2、3。
图4B为根据本申请实施例提供的另一个示例游戏场景的示意图。如图4B所示,其中,模拟器在该游戏帧中,设置下车贴图图片420、压车头贴图图片421、抬车头贴图图片422、鸣笛贴图图片423对应,分别对应的按键为:F键、Q键、E键、鼠标左键等。另外,设置开/关麦克风贴图图片425、开/关声音贴图图片426、游戏设置贴图图片427对应的按键分别为:Y键、T键、Esc键。设置查看周围环境贴图图片424对应Alt键,方向控制贴图图片428对应W键、S键、A键、D键,背包贴图图片429对应Table键。
上述按键中未明确指出为鼠标键的为键盘上的按键。
根据本申请实施例,通过在虚拟操作系统中启动应用程序,对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所用的渲染数据,识别所述渲染数据中预设的贴图图片,当识别到所述虚拟场景中有预设的贴图图片时,调用对应的键位配置文件,以完成所述贴图图片与所述计算设备的对应实体按键的匹配处理,在所述计算设备的屏幕上呈现所述虚拟场景及与该虚拟场景对应的按键配置信息,本申请实施例能够基于对游戏中贴图图片的识别,快速高效地识别游戏场景,实时动态地在识别到的游戏场景中自动合理布局游戏操作按键,使得游戏操作按键设置智能化,游戏玩家无需在游戏场景中自行设置游戏操作按键,仅需根据计算设备自动设置的游戏操作按键,通过操作键盘和鼠标上对应的实体按键,便可实现对手机游戏进行操作,该种基于游戏场景识别的按键设置方式能够让游戏玩家有较好的游戏操作体验,较为智能化,效果较佳。
图5为根据本申请实施例提供的对应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据的流程示意图。图5可以是对图2中的步骤202进一步的描述,也可以是对其他实施例中相同或类似步骤的进一步描述。如图5所示,步骤202可以包括:
步骤501,计算设备通过检测对所述待呈现的虚拟场景进行渲染的渲染函数的调用,得到所述渲染函数被调用时关联的顶点缓存对象中的渲染数据。
应用程序在渲染虚拟场景时,可以使用OpenGL(Open Graphics Library)中的函数,例如glDrawElements函数。
OpenGL定义了跨编程语言、跨平台的应用程序编程(API,Application Programming Interface)规范。应用程序的图形渲染需求最后都交由实现了OpenGL协议的显卡驱动等渲染相关程序处理。在OpenGL进行图形渲染的过程中,可能需要经过顶点变换、图元装配、纹理映射、光栅化等一系列的计算机图形学操作。
glDrawElements函数是一个图元渲染函数,从数组中获得数据绘制或渲染图元。在一种可能的实现方式中,图元例如为三角形、线、点等。在本申请实施例中,以渲染的图元为三角形为例进行说明。作为一个示例,在该glDrawElements函数中的函数参数例如包括:图元类型(mode),用于说明要绘制或渲染的图元的类型;计数值(Count),表示以图元类型连接的顶点的总数,根据不同的mode,Count小于或等于单个mode类型图元的顶点数*图元数;索引值的类型(type),为下列值中的一个:GL_UNSIGNED_BYTE、GL_UNSIGNED_SHORT、GL_UNSIGNED_INT;以及指针(indices),指向索引存贮位置的指针。当调用glDrawElements函数时,该函数会通过索引使用Count表示的数目的一系列的元素,来创建一系列的图元。
OpenGL中包括各种库函数,例如基本库函数、实用库函数、辅助库函数、实用工具库函数等等。在OpenGL中提供有对顶点缓存对象(VBO,Vertex Buffer Object)进行操作的API函数。VBO是OpenGL在显卡存储空间中创建的一块内存缓存区,用于存储顶点的各类属性信息。其中,前述属性信息例如可以包括顶点颜色数据、顶点坐标、纹理坐标、顶点法向量,等等。顶点是计算机图形学中的一个基本概念。在计算机图形学中,二维或三维的物体例如可以是通过三角形(图元)绘制的。每个三角形有三个顶点,每个顶点都有一个3D位置。一个三角形的顶点的3D位置可以用一个数组来定义。由于OpenGL是在3D空间中工作的,而渲染的是一个2D三角形,因此,可以将三角形的顶点的z坐标设置为0.0。顶点坐标作为输入,输入到顶点着色器中,顶点着色器会在图形处理器(GPU,Graphics Processing Unit)上创建内存存储顶点坐标,并配置OpenGL如何解释这些内存,并指定如何将顶点坐标发送给显卡。顶点坐标是三维坐标,而显示屏幕是二维的,对顶点的顶点坐标进行计算便可以得到顶点在显示屏幕上的最终显示位置。
在进行渲染时,渲染函数例如上述的glDrawElements函数可以从VBO中读取出顶点的各类属性信息进行使用。
根据本申请实施例,模拟器可以通过检测对待呈现的虚拟场景进行渲染的glDrawElements函数的调用,得到该渲染函数被调用时关联的顶点缓存对象VBO中的渲染数据。
根据本申请实施例,通过检测对所述待呈现的虚拟场景进行渲染的渲染函数的调用,得到所述渲染函数被调用时关联的顶点缓存对象中的渲染数据,可以实时获得应用程序当前的运行状态,以进行相应的处理。
图6为根据本申请实施例提供的识别渲染数据中预设的贴图图片的流程示意图。图6可以是对图2中的步骤203进一步的描述,也可以是对其他实施例中相同或类似步骤的进一步描述。
在OpenGL中,对图形渲染时的纹理映射操作又称为纹理贴图操作,是将贴图图片作为纹理贴到所要渲染的物体的表面上来增强真实感。纹理映射操作是以三角形为单位进行的,需要指定物体表面上的三角形的每一个顶点在贴图中所对应的位置,再进行贴图。根据本申请实施例,所要渲染的物体为游戏中的对象。作为一个示例,纹理映射操作例如可以是通过glDrawElements函数进行的。
根据本申请实施例,模拟器在检测应用程序渲染游戏帧时,+可以通过检测应用程序对进行纹理映射的glDrawElements函数的调用实现。模拟器通过分析该函数使用的VBO中的顶点数据,得到所要渲染的顶点对应的贴图的标识。模拟器可以根据该贴图的标识,识别出渲染所述游戏帧所用的贴图。另外,模拟器再从VBO中取得所述顶点对应的所述贴图中的贴图图片的纹理坐标。通过贴图图片的纹理坐标便可以定位到贴图中相应贴图图片。
步骤203可以包括:
步骤601,计算设备判断所述渲染数据中是否包括所述预设的贴图图片所在的预设贴图。
根据本申请实施例,模拟器可以通过比较VBO中的渲染数据包括的贴图的标识和记录的预设贴图的标识,判断渲染数据中是否包括预设的贴图图片所在的预设贴图。在渲染数据中的贴图的标识和所述记录的预设贴图的标识匹配时,模拟器确定所述渲染数据中包括所述预设的贴图图片所在的所述预设贴图的标识,即判断出当前渲染的虚拟场景中有需要进一步处理的贴图图片,例如,设置对应的实体按键。
图7是对图6中的步骤601、602的具体描述。如图7所示,步骤601可以包括:
步骤701:计算设备比较所述渲染数据中的贴图的标识和记录的所述预设贴图的标识;
步骤702:计算设备在所述渲染数据中的贴图的标识和记录的所述预设贴图的标识匹配时,确定所述渲染数据中包括所述预设的贴图图片所在的所述预设贴图。
计算设备在通过模拟器运行应用程序的过程中,会先将应用程序中可能会用到的所有贴图从硬盘加载到内存。因此,当模拟器启动应用程序之后,会检测虚拟操作系统将应用程序数据从硬盘加载到显卡内存。应用程序数据中包括应用程序所用的贴图。根据本申请实施例,贴图加载进内存时会调用 glTexImage2D函数。
通过检测glTexImage2D函数,模拟器可以取得各个贴图的循环冗余校验码。模拟器校验所取得的贴图的循环冗余校验码,并比较它和贴图图片配置文件中的预设贴图的循环冗余校验码,以判断应用程序数据中的贴图的循环冗余校验码是否和预设贴图的循环冗余校验码匹配。当模拟器确定应用程序数据中的贴图的循环冗余校验码和贴图图片配置文件中的预设贴图的循环冗余校验码匹配时,记录应用程序数据中的所述贴图的标识(ID),作为所述预设贴图的标识。作为一个示例,贴图的标识例如可以为一个数字字符串,本申请实施例对此不进行具体限定。
对于一个贴图,应用程序每次启动时,可能会给贴图分配不同的标识。而贴图的循环冗余校验码是不变的。由于应用程序在使用贴图时,使用的是贴图的标识,所以需要在贴图图片配置文件中配置要检测的贴图的循环冗余校验码,根据贴图的循环冗余校验码确定应用程序数据中的贴图的标识,以便检测贴图是否被应用程序使用。
图8为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法的进一步流程示意图。如图8所示,在步骤201的启动应用程序之后,所述方法可以包括:
步骤801,计算设备检测虚拟操作系统将应用程序数据从硬盘加载到显卡内存;
步骤802,计算设备校验所述应用程序数据中的贴图的循环冗余校验码,并判断所述应用程序数据中的贴图的循环冗余校验码是否和所述预设贴图的循环冗余校验码匹配;
步骤803,计算设备在确定所述应用程序数据中的贴图的循环冗余校验码和所述预设贴图的循环冗余校验码匹配时,记录所述应用程序数据中的所述贴图的标识,作为所述预设贴图的标识。
根据本申请实施例,模拟器在检测到虚拟操作系统通过glTexImage2D函数将应用程序数据从硬盘加载到显卡内存时,会校验应用程序中的贴图的循环冗余校验码,比较应用程序中的贴图的循环冗余校验码和贴图图片配置文件中的预设贴图的循环冗余校验码,在二者匹配时,记录应用程序数据中的贴图的标识,作为所述预设贴图的标识。
步骤602,计算设备在确定所述渲染数据中包括所述预设的贴图图片所在的预设贴图时,判断所述渲染数据中是否包括所述预设的贴图图片在所述预设贴图中的位置信息。
在该步骤中,如图7、图8所示,判断所述渲染数据中是否包括所述预设的贴图图片在所述预设贴图中的位置信息包括:
步骤703:计算设备比较所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标;
步骤704:计算设备在所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标匹配时,确定所述渲染数据中包括所述 预设的贴图图片在所述预设贴图中的位置信息。
在一种可能的实现方式中,模拟器可以比较渲染数据中所渲染顶点对应的贴图图片的纹理坐标和贴图图片配置文件中配置的预设的贴图图片的纹理坐标,在二者匹配时,确定渲染数据中包括预设的贴图图片在所述预设贴图中的位置信息。
作为一个示例,在渲染函数glDrawElements函数所使用的VBO中,会包括该渲染函数所使用的贴图图片的纹理坐标,即贴图图片在其所在贴图中的位置坐标。根据本申请实施例,贴图或纹理图像的坐标范围例如可以是从(0,0)到(1,1)之间,这个坐标是经过归一化处理的坐标。例如,一个贴图的左上角坐标为(0,0),右下角坐标为(1,1)。贴图中的各贴图图片在该贴图中的位置可以用各贴图图片四个角在该贴图的坐标体系中的位置表示。参考图3B所示的贴图34及贴图图片35。
以图3B为例,根据本申请实施例,模拟器在识别游戏帧中的贴图图片时,可以分析glDrawElements函数所使用的VBO,查找到贴图34的标识,再查找到纹理坐标为{(0.000000,0.462891),(0.000000,0.255859),(0.207031,0.255859),(0.207031,0.462891)},则可以根据查找到的纹理坐标确认当前游戏帧中使用的是贴图34中的贴图图片35。
步骤603,计算设备在确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息时,确定所述渲染函数使用的渲染数据中包括所述预设的贴图图片。
根据本申请实施例,模拟器在确定VBO中的渲染数据包括预设的贴图图片所在的预设贴图,以及预设的贴图图片在其所在的预设贴图中的纹理坐标时,确定渲染数据中有其要检测的、贴图图片配置文件中的预设的贴图图片。
根据本申请实施例的应用程序的虚拟场景识别与交互键位匹配方法,由于游戏中贴图图片在贴图中的纹理坐标是固定的,对于贴图里某一个贴图图片,每一次渲染所使用的该贴图图片的纹理坐标都是一样的,因此,通过检测渲染游戏帧时调用的渲染函数,查找渲染函数使用的渲染数据,判断渲染数据中是否包括预设的贴图图片所在的预设贴图,在确定渲染数据中包括所述预设的贴图时,再判断所述渲染数据中是否包括所述预设的贴图图片在所述预设贴图中的位置信息,在确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息时,确定所述渲染函数使用的渲染数据中包括预设的贴图图片,只需要进行几个坐标数值的对比,便能精准地判断游戏状态,快速知道某一个贴图图片是否要被游戏显示,所处理的字节数比较少,使得计算设备渲染游戏帧的速度更快,所需的计算资源开销更少,性能更高。
图9为根据本申请实施例提供的当识别到所述虚拟场景中包括预设的贴图图片时,调用对应的键位配置文件,以完成所述贴图图片与所述计算设备的对应实体按键的匹配处理的流程示意图。图9是以对图2中的步骤204进一步的描述为例,同时图9也适用于前述其他各个实施例中相同或类似步骤的描述。
如图9所示,步骤204可以包括:
步骤901,当识别到所述虚拟场景中包括预设的贴图图片时,计算设备调用对应的键位配置文件,根据键位配置文件中所述预设的贴图图片和对应的按键信息,确定所述预设的贴图图片对应的实体按键,使能该实体按键生效。
根据本申请实施例,模拟器可以在通过比较贴图图片配置文件中的贴图图片和渲染虚拟场景的渲染函数所使用的渲染数据中的贴图图片,确定出当前虚拟场景中有需要设置对应的实体按键的预设的贴图图片之后,在键位配置文件中查找该预设的贴图图片对应的按键信息。例如,模拟器可以比较从所述贴图图片配置文件中得到的所述预设的贴图图片的标识与键位配置文件中的贴图图片的标识,在二者匹配后,根据键位配置文件确认贴图图片的标识所对应的实体按键的名称、实体按键对应的屏幕位置坐标。
例如,示例的键位配置文件如下:
<Switch Name=“GetIntoCar”TextureId=“30”Comm=“上车驾驶”/>
……
<KeyMapping ItemName=“F”Point_X=“0.656250”Point_Y=“0.419444”Description=“上车”>
<SwitchOperation Description=“上车”EnableSwitch=“GetIntoCar”/>
</KeyMapping>
其中,键位配置文件配置了贴图图片的标识(TextureId)为“30”,名称(Name)为“GetIntoCar”,备注(Comm)为“上车驾驶”。同时键位配置文件配置了在匹配到键位配置文件中的贴图图片“GetIntoCar”时,得到按键“F”对应的屏幕位置坐标(0.656250,0.419444)。
模拟器根据前述示例的贴图图片配置文件的标识,查找到TextureId为“30”的贴图图片对应的名称Itemname“GetIntoCar”,再通过SwitchOperation语句中的EnableSwitch,使能GetIntoCar的贴图图片生效。再根据“GetIntoCar”查找到其对应的按键为“F”、按键F对应的屏幕位置的横坐标0.656250、纵坐标0.419444,即键盘上的按键F会生效。作为一个示例,实体按键对应的屏幕位置坐标例如可以为实体按键的按键图标的中心在显示屏幕上的位置。
在键位配置文件中,可以通过多条SwitchOperation语句配置同一个实体按键在不同贴图图片生效时生效。作为一个示例,排列在最前的贴图图片的优先级最高。SwitchOperation语句中也可以携带实体按键对应的屏幕位置坐标。这种情况下,某个贴图图片生效后,其他贴图图片便无效,实体按键对应的屏幕位置坐标仅能是生效的那个贴图图片所在的SwitchOperation语句中的坐标。这样配置更为灵活。
步骤902,确定所述预设的贴图图片对应的实体按键在显示屏幕上的操作位置。
在一种可能的实现方式中,模拟器例如可以是根据所述键位配置文件中的实体按键对应的屏幕位置坐标,确定对应的实体按键在显示屏幕上的操作位置;或者是根据所述预设的贴图图片在显示屏幕上的显示位置,确定所述对应的实体按键在显示屏幕上的操作位置。
对于根据键位配置文件中的实体按键对应的屏幕位置坐标,确定对应的实 体按键在屏幕上的操作位置,这种确定方法简单且具有通用性,在不方便根据预设的贴图图片在显示屏幕上的显示位置,确定实体按键在显示屏幕上的操作位置的情况下,也可以适用。
例如,在绝地求生的游戏中,假设游戏中代表游戏玩家的人物死亡时,游戏场景中的某个位置会有一个正方形的图,模拟器检测到这个图的贴图图片时,会得知游戏结束,需要用户确认是否退出,但这个图所在的位置处是不能进行操作的,需要在退出按钮的位置进行操作。因此,这种情况下,会预先在键位配置文件中配置执行退出操作的实体按键在显示屏幕上的操作位置。在游戏的运行过程中,模拟器如果检测到游戏场景中包括所述正方形的贴图图片时,会在退出按钮的位置处设置相应的实体按键的按键图标,供用户进行退出操作。
针对根据预设的贴图图片在显示屏幕上的显示位置,确定对应的实体按键在显示屏幕上的操作位置的情况,可以包括以下步骤:
查找所述渲染函数使用的渲染数据中的所述预设的贴图图片所要渲染的顶点坐标;
根据所述顶点坐标,计算出所述预设的贴图图片在显示屏幕上的显示区域;
根据所述预设的贴图图片在显示屏幕上的显示区域,计算出所述对应的实体按键在显示屏幕上的操作位置。
根据本申请实施例,例如,模拟器在检测到应用程序根据游戏逻辑渲染游戏帧时,可以从渲染函数glDrawElements所使用的VBO中,查找到贴图图片所渲染的游戏帧中的三角形的顶点坐标。其中,这个顶点坐标例如可以是组成各种操作按钮的三角形的顶点坐标。作为一个示例,这个顶点坐标例如可以是基于第二操作系统的屏幕位置,例如为安卓操作系统下的顶点在显示屏幕上的位置。作为一个示例,安卓操作系统下,显示屏幕上的最左上角为坐标原点(0,0),最右下角为坐标最大值。举例来说,如在安卓操作系统下屏幕分辨率为1024*576,则这个坐标最大值为(1024,576)。
例如,在用正方形的贴图图片渲染游戏帧中的三角形时,需要两个三角形,每个三角形有三个顶点,例如第一个三角形中包括顶点1、2、3,第二个三角形中包括顶点4、5、6。在这两个三角形中,有两个顶点是重合的。计算设备可以获得这两个三角形的各个顶点的顶点坐标。
之后,模拟器可以根据游戏帧中的三角形的顶点坐标,计算出贴图图片在显示屏幕上的显示位置。例如,模拟器可以根据游戏帧中的三角形的顶点坐标,通过模型变换(Model)、视图变换(View)、投影变换(Projection)后,再通过视口变换计算出第一操作系统下贴图图片在显示屏幕上的实际显示位置和显示区域。其中,上述模型变换、试图变换和投影变换简称MVP变换。另外,模拟器可以在获取到游戏帧中的三角形的顶点坐标之后,根据游戏帧中的三角形的顶点坐标,计算出游戏帧中的三角形上所贴的贴图图片在第一操作系统下的显示屏幕上的显示位置。
根据本申请实施例,例如,模拟器可以先根据贴图图片在第一操作系统下的显示屏幕上的显示位置,确定贴图图片在显示屏幕上的显示区域,再根据贴图图片在显示屏幕上的显示区域,确定贴图图片对应的实体按键在显示屏幕上 的操作位置位于该显示区域内。
根据本申请实施例的应用程序的虚拟场景识别与交互键位匹配方法,通过在键位配置文件中查找与预设的贴图图片对应的按键信息,确定对应的实体按键,并确定实体按键在显示屏幕上的操作位置,以设置实体按键,由于计算设备可以实时、动态地设置相应的实体按键,并使之生效,使得游戏玩家可以实时操作游戏,有更好的游戏操作体验。
图10为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配方法的进一步流程示意图。如图10所示,在图2所示的各步骤之外,本申请的应用程序的虚拟场景识别与交互键位匹配方法进一步包括:
步骤1001,检测在第一操作系统下通过输入输出设备输入的对所述预设的贴图图片对应的实体按键的操作消息。
根据本申请实施例,模拟器会检测用户在计算设备的第一操作系统下,通过键盘和鼠标之类的输入输出设备输入的操作消息,该操作消息是用于对显示屏幕上显示的虚拟场景中的按键图标进行操作的操作消息。
在游戏的情况下,模拟器会通过Hook Windows的消息,确定游戏玩家按下了键盘或鼠标上的某个实体按键,之后模拟器再查看键位配置文件中,这个实体按键是否生效。在确定游戏玩家按下了键盘或鼠标上对应的实体按键时,判断该操作是有效的。其中,前述对应的实体按键与显示屏幕上显示的游戏场景中的按键图标对应。
步骤1002,将所述预设的贴图图片对应的实体按键在显示屏幕上的操作位置写入触摸屏设备文件中,由触摸屏驱动程序从触摸屏设备文件中读取该操作位置,并将该操作位置送入所述应用程序中,由所述应用程序在该操作位置进行与所述虚拟场景的交互操作。
如前所述,模拟器可以通过键位配置文件,得到所要设置的实体按键在显示屏幕上的操作位置,或者根据贴图图片的位置信息得到要设置的实体按键在显示屏幕上的操作位置。之后,当模拟器检测到游戏玩家触发对显示屏幕上所显示的按键图标的操作时,会将所述实体按键对应的操作数据写入安卓操作系统的触摸屏设备文件中。其中,所述实体按键对应的操作数据例如可以是实体按键在显示屏幕上的操作位置,本申请实施例对此不进行具体限定。在写入之前,可以将实体按键在显示屏幕上的操作位置进行坐标转换,转换为对应的安卓值,即转换为在虚拟安卓操作系统下的操作位置,再写入触摸屏设备文件中。
手机上的安卓操作系统通常是基于Linux内核开发的,可以分成内核层和应用层。应用层可理解为应用程序(App)的运行环境,其中,应用程序可为游戏;内核层为应用层提供基础服务。在内核层植入有针对于各类硬件的驱动程序,称为硬件驱动程序。硬件驱动程序通过读写安卓操作系统的设备文件,便可以知道硬件的数据。硬件驱动程序将取得的硬件的数据发送到上层应用中,运行应用程序。在安卓操作系统中,有各种硬件驱动程序及对应的设备文件,例如有触摸屏驱动程序、蓝牙驱动程序、音响驱动程序等等。而相应地,有和硬件驱动程序对应的触摸屏设备文件、蓝牙设备文件、音响设备文件等等。
以硬件为手机上的触模屏为例,当触摸屏被触模时,会将因触摸产生的屏幕位置坐标(x,y)作为触摸数据,传送到触模屏设备文件中。内核层的触模屏驱动程序读取该触摸屏设备文件,获得触摸屏被触模的触摸数据,然后会将触摸数据传到上层应用中。安卓操作系统的上层应用,例如游戏,再根据触摸数据响应触摸屏被触摸的消息。
本申请实施例根据上述安卓操作系统的处理过程,在以PC模拟器模拟安卓手机游戏时,PC模拟器在待渲染游戏帧时,通过贴图图片的位置信息计算得到键盘或鼠标的实体按键对应的屏幕位置坐标后,向触模屏设备文件写入实体按键对应的屏幕位置坐标作为触摸数据,便可以模拟安卓的触模消息。即PC模拟器模拟触摸屏硬件写触摸屏设备文件的过程。
因此,根据本申请实施例,游戏玩家在玩游戏的过程中按下键盘上的实体按键,当计算设备检测到该实体按键与显示屏幕上显示的游戏帧中的某个按键图标匹配时,计算设备可以认为键盘上的实体按键的按下是有效的操作。进而,计算设备可以查找到与显示屏幕上实体按键对应的贴图图片所在的屏幕位置坐标(x,y),再将这个屏幕位置坐标写进触摸屏设备文件。
例如,根据前述示例的键位配置文件,计算设备会将F键对应的屏幕位置坐标X=0.656250,Y=0.419444作为触摸数据写入触摸屏文件中。
根据本申请实施例,计算设备调用触摸屏驱动程序读取触摸屏设备文件中写入的操作数据,例如,屏幕位置坐标,将这个屏幕位置坐标上传到上层应用中,便产生显示屏幕被按下消息,模拟触摸屏情况下点击显示屏幕的操作。
例如,根据前述示例的键位配置文件,计算设备调用触摸屏驱动程序读取触摸屏设备文件中的屏幕位置坐标X=0.656250,Y=0.419444后,可以计算出对应的在安卓操作系统下的屏幕位置坐标,通过触摸屏驱动程序发送到安卓操作系统中,便可以得知游戏玩家按下了上车键。
计算设备在得到触摸屏驱动程序读取的实体按键的屏幕位置坐标后,在该坐标点执行点击操作,以实现对与该实体按键对应的贴图图片所表示的游戏帧中的虚拟操作按钮的操作。之后,计算设备根据游戏逻辑,在显示屏幕上输出该虚拟操作按钮被点击后的结果。
例如,根据前述示例的键位配置文件,计算设备根据由屏幕位置坐标X=0.656250,Y=0.419444计算的在安卓操作系统下的屏幕位置坐标,在显示屏幕上点击相应的位置,使游戏中的人物上车。
根据本申请实施例的应用程序的虚拟场景识别与交互键位匹配方法,通过检测在第一操作系统下通过输入输出设备输入的对所述预设的贴图图片对应的实体按键在显示屏幕上的操作位置,将所述实体按键在显示屏幕上的操作位置写入触摸屏设备文件中,由触摸屏驱动程序从触摸屏设备文件中读取该写入的操作位置,并将该操作位置送入所述应用程序中,由所述应用程序在该操作位置进行与所述虚拟场景的交互操作,实现了用户使用键盘和鼠标操作游戏。
图11为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配装置示意图。如图11所示,根据本申请实施例提供的应用程序的虚拟场景识别 与交互键位匹配装置可以包括:
启动模块1101,用于在虚拟操作系统中启动应用程序;
渲染检测模块1102,用于对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据;
贴图图片识别模块1103,用于识别所述渲染数据中是否包括预设的贴图图片;
键位设置模块1104,用于当识别到所述渲染数据中包括预设的贴图图片时,调用对应的键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理;
显示模块1105,用于在所述计算设备的显示屏幕上呈现所述虚拟场景及与该虚拟场景对应的按键配置信息,所述按键配置信息与匹配到的实体按键关联。
根据本申请实施例,所述渲染检测模块1102用于通过检测所述应用程序对渲染函数的调用,得到所述渲染函数被调用时顶点缓存对象中的渲染数据,将所述顶点缓存对象中的渲染数据作为渲染所述虚拟场景所使用的渲染数据,所述渲染函数用于对所述虚拟场景进行渲染处理。
根据本申请实施例,通过在虚拟操作系统中启动应用程序,对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所用的渲染数据,识别所述渲染数据中预设的贴图图片,当识别到所述虚拟场景中有预设的贴图图片时,调用对应的键位配置文件,以完成所述贴图图片与所述计算设备的对应实体按键的匹配处理,在所述计算设备的屏幕上呈现所述虚拟场景及与该虚拟场景对应的按键配置信息,本申请实施例能够基于对手机游戏应用程序中贴图图片的识别,快速高效地识别游戏场景,实时动态地在识别到的游戏场景中自动合理布局游戏操作按键,使得游戏操作按键设置智能化,游戏玩家能很方便地使用鼠标和键盘体验游戏,不再需要自行设置大量游戏操作按键,使得游戏玩家能够有较好的游戏操作体验,因此该种基于游戏场景识别的按键设置方式较为智能化,效果较佳。
图12为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配装置中的贴图图片识别模块的示意图。如图12所示,贴图图片识别模块1103可以包括:
贴图判断模块1201,用于判断所述渲染数据中是否包括所述预设的贴图图片所在的预设贴图;
纹理坐标判断模块1202,用于在所述贴图判断模块1201确定所述渲染数据中包括所述预设的贴图图片所在的预设贴图时,判断所述渲染数据中是否包括所述预设的贴图图片在所述预设贴图中的位置信息;
贴图图片确认模块1203,用于在所述纹理坐标判断模块1202判断所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息时,确定所述渲染函数使用的渲染数据中包括所述预设的贴图图片。
根据本申请实施例,如图13所示,贴图判断模块1201可以包括:贴图标识比较模块1301,用于比较所述渲染数据中的贴图的标识和记录的所述预设贴图 的标识;贴图确认模块1302,用于在所述贴图标识比较模块1301确定所述渲染数据中的贴图的标识和记录的所述预设贴图的标识匹配时,确定所述渲染数据中包括所述预设的贴图图片所在的所述预设贴图。
纹理坐标判断模块1202可以包括:纹理坐标比较模块1303,用于比较所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标;纹理坐标确定模块1304,用于在所述纹理坐标比较模块1303确定所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标匹配时,确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息。
图14为根据本申请实施例提供的另一个应用程序的虚拟场景识别与交互键位匹配装置示意图。如图14所示,在图13的基础上,所述应用程序的虚拟场景识别与交互键位匹配装置进一步包括以下模块:
贴图加载检测模块1401,用于检测虚拟操作系统是否将应用程序数据从硬盘加载到显卡内存;
校验模块1402,用于在检测到所述虚拟操作系统将应用程序数据从硬盘加载到显卡内存后,校验所述应用程序数据中的贴图的循环冗余校验码,并判断所述应用程序数据中的贴图的循环冗余校验码是否和所述预设贴图的循环冗余校验码匹配;
记录模块1403,用于在所述校验模块确定所述应用程序数据中的贴图的循环冗余校验码和所述预设贴图的循环冗余校验码匹配时,记录所述应用程序数据中的所述贴图的标识,将所述应用程序数据中的所述贴图的标识,作为所述预设贴图的标识。
根据本申请实施例的应用程序的虚拟场景识别与交互键位匹配装置,由于游戏中贴图图片在贴图中的纹理坐标是固定的,对于贴图里某一贴图图片,每一次渲染所使用的该贴图图片的纹理坐标都是一样,因此,通过检测渲染游戏帧时调用的渲染函数,判断渲染数据中是否包括预设的贴图图片所在的预设贴图,在确定渲染数据中包括所述预设的贴图时,再判断所述渲染数据中是否包括所述预设的贴图图片在所述预设贴图中的位置信息,在确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息时,确定所述渲染函数使用的渲染数据中包括预设的贴图图片,只需要进行几个坐标数值的对比,便能快速获知某一贴图图片是否需要被游戏显示,所处理的字节数比较少,使得计算设备渲染游戏帧的速度更快,所需的计算机资源开销更少,性能更高。
图15为根据本申请实施例提供的应用程序的虚拟场景识别与交互键位匹配装置中的按键确定模块的示意图。如图15所示,按键确定模块1104包括:
按键确定模块1501:用于当识别到所述渲染数据包括有预设的贴图图片时,调用对应的键位配置文件,根据键位配置文件中所述预设的贴图图片和对应的按键信息,确定所述预设的贴图图片对应的实体按键,使能该实体按键生效;
按键位置确定模块1502:用于确定所述预设的贴图图片对应的实体按键在显示屏幕上的操作位置。
根据本申请实施例的应用程序的虚拟场景识别与交互键位匹配装置,通过在键位配置文件中查找与预设的贴图图片对应的按键信息,确定对应的实体按键,并确定该实体按键在显示屏幕上的操作位置,即计算设备可以实时、动态地得设置相应的实体按键,并使之生效,使得游戏玩家可以实时操作游戏,有更好的游戏操作体验。
上述装置实施例中的各模块的操作可以参考前面方法描述的具体内容。
图16为根据本申请实施例提供的计算设备的硬件结构示意图。如图16所示,计算设备可因配置或性能不同而产生比较大的差异,可以包括一个或一个以上中央处理器(central processing units,CPU)1601(例如,一个或一个以上处理器)和存储器1602,一个或一个以上存储应用程序1603或数据1604的存储介质1605(例如一个或一个以上海量存储设备)。其中,存储器1602和存储介质1605可以是短暂存储或持久存储。存储在存储介质1605的程序可以包括一个或一个以上模块(图示没标出),每个模块可以包括对计算设备中的一系列指令操作。更进一步地,中央处理器1601可以设置为与存储介质1605通信,在计算设备上执行存储介质1605中的一系列指令操作。
具体地,在存储介质1605中储存的应用程序1603包括应用程序的虚拟场景识别与交互键位匹配装置的应用程序,且该程序可以包括上述应用程序的虚拟场景识别与交互键位匹配装置中的启动模块1101、检测模块1102、贴图图片识别模块1103、键位设置模块1104、显示模块1105、交互模块1106,在此不进行赘述。更进一步地,中央处理器1601可以设置为与存储介质1605通信,在计算设备上执行存储介质1605中储存的游戏操作的应用程序对应的一系列操作。所述游戏操作的应用程序也可以包括各方法流程图中描述的操作。
计算设备还可以包括一个或一个以上电源1606,一个或一个以上有线或无线网络接口1607,一个或一个以上输入输出接口1608,和/或,一个或一个以上操作系统1609,例如Windows ServerTM,Mac OS XTM,UnixTM,LinuxTM,FreeBSDTM等等。
本申请实施例提供了一种计算设备,该计算设备能够构建或运行虚拟操作系统,所述计算设备包括:处理器和存储器,所述存储器上存储有计算机可读指令,所述计算机可读指令由所述处理器执行以完成前述各个方法实施例所述的应用程序的虚拟场景识别与交互键位匹配方法。
根据本申请实施例的计算设备,根据本申请实施例,通过在虚拟操作系统中启动应用程序,对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所用的渲染数据,识别所述渲染数据中预设的贴图图片,当识别到所述虚拟场景中有预设的贴图图片时,调用对应的键位配置文件,以完成所述贴图图片与所述计算设备的对应实体按键的匹配处理,在所述计算设备的屏幕上呈现所述虚拟场景及与该虚拟场景对应的按键配置信息,本申请实施例能够基于对手机游戏应用程序中贴图图片的识别,快速高效地识别游戏场景,实时动态地在识别到的游戏场景中自动合理布局游戏操作按键,使得游戏操作按键设置智能化,游戏玩家能很方便地使用鼠标和键盘体验游戏,不再需要自 行设置大量游戏操作按键,使得游戏玩家能够有较好的游戏操作体验,因此该种基于游戏场景识别的按键设置方式较为智能化,效果较佳。
本申请是参照根据本申请实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的,应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
另外,在本申请各个实施例中的各功能模块可以集成在一个处理单元中,也可以是各个模块单独物理存在,也可以两个或两个以上模块集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
另外,本申请的每个实例可以通过由数据处理设备如计算机执行的数据处理程序来实现。显然,数据处理程序构成了本申请。此外,通常存储在一个存储介质中的数据处理程序通过直接将程序读取出存储介质或者通过将程序安装或复制到数据处理设备的存储设备(如硬盘和或内存)中执行。因此,这样的存储介质也构成了本申请。存储介质可以使用任何类型的记录方式,例如纸张存储介质(如纸带等)、磁存储介质(如软盘、硬盘、闪存等)、光存储介质(如CD-ROM等)、磁光存储介质(如MO等)等。
因此,本申请还提供了一种非易失性存储介质,其中存储有游戏操作程序,该游戏操作程序用于执行本申请上述实施例方法中的任何一种实例。
以上所述仅为本申请的实例而已,并不用以限制本申请,凡在本申请的精神和原则之内,所做的任何修改、等同替换、改进等,均应包含在本申请保护的范围之内。

Claims (21)

  1. 一种应用程序的虚拟场景识别与交互键位匹配方法,其特征在于,应用于能够构建或运行虚拟操作系统的计算设备,所述应用程序的虚拟场景识别与交互键位匹配方法包括:
    在所述虚拟操作系统中启动应用程序;
    对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据;
    当识别到所述渲染数据中包括预设的贴图图片时,调用键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理;
    在所述计算设备的显示屏幕上呈现所述虚拟场景及与所述虚拟场景对应的按键配置信息,所述按键配置信息与匹配到的实体按键关联。
  2. 根据权利要求1的方法,其特征在于,所述对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据包括:
    通过检测所述应用程序对渲染函数的调用,得到所述渲染函数被调用时顶点缓存对象中的渲染数据,将所述顶点缓存对象中的渲染数据作为渲染所述虚拟场景所使用的渲染数据,所述渲染函数用于对所述虚拟场景进行渲染处理。
  3. 根据权利要求1的方法,其特征在于,所述方法还包括:
    当确定所述渲染数据中包括所述预设的贴图图片所在的预设贴图时,以及,
    当确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息时,确定所述渲染数据中包括所述预设的贴图图片。
  4. 根据权利要求3的方法,其特征在于,所述方法还包括:
    比较所述渲染数据中的贴图的标识和记录的所述预设贴图的标识;
    在确定所述渲染数据中的贴图的标识和记录的所述预设贴图的标识匹配时,确定所述渲染数据中包括所述预设贴图。
  5. 根据权利要求3或4的方法,其特征在于,所述方法还包括:
    比较所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标;
    在确定所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标匹配时,确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息。
  6. 根据权利要求4的方法,其特征在于,在所述虚拟操作系统中启动应用程序之后,所述方法还包括:
    在检测到所述虚拟操作系统将应用程序数据从硬盘加载到显卡内存后,当 确定所述应用程序数据中的贴图的循环冗余校验码和所述预设贴图的循环冗余校验码匹配时,将所述应用程序数据中的所述贴图的标识,作为所述预设贴图的标识。
  7. 根据权利要求1的方法,其特征在于,所述当识别到所述渲染数据中包括预设的贴图图片时,调用键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理包括:
    当识别到所述渲染数据中包括所述预设的贴图图片时,调用所述键位配置文件,根据所述键位配置文件中与所述预设的贴图图片对应的按键信息,确定所述预设的贴图图片对应的实体按键,使能所述实体按键生效;
    确定所述预设的贴图图片对应的实体按键在显示屏幕上的操作位置。
  8. 根据权利要求7的方法,其特征在于,所述确定所述预设的贴图图片对应的实体按键在显示屏幕上的操作位置包括:
    根据所述键位配置文件中与所述预设的贴图图片对应的按键信息,确定所述预设的贴图图片对应的实体按键在所述显示屏幕上的操作位置;或者
    根据所述预设的贴图图片在所述显示屏幕上的显示位置,确定所述预设的贴图图片对应的实体按键在所述显示屏幕上的操作位置。
  9. 根据权利要求8的方法,其特征在于,所述根据所述预设的贴图图片在所述显示屏幕上的显示位置,确定所述预设的贴图图片对应的实体按键在所述显示屏幕上的操作位置包括:
    查找所述渲染数据中的所述预设的贴图图片所要渲染的顶点坐标;
    根据所述顶点坐标,获取所述预设的贴图图片在所述显示屏幕上的显示区域;
    根据所述预设的贴图图片在所述显示屏幕上的显示区域,获取所述预设的贴图片对应的实体按键在所述显示屏幕上的操作位置。
  10. 根据权利要求1的方法,其特征在于,所述方法进一步包括:
    检测在第一操作系统下通过输入输出设备输入的对所述预设的贴图图片对应的实体按键的操作消息;
    将所述预设的贴图图片对应的实体按键在所述显示屏幕上的操作位置写入触摸屏设备文件中,由触摸屏驱动程序从所述触摸屏设备文件中读取所述操作位置,并将所述操作位置送入所述应用程序中,由所述应用程序在所述显示屏幕上的所述操作位置进行与所述虚拟场景的交互操作。
  11. 一种计算设备,其特征在于,所述计算设备能够构建或运行虚拟操作系统,所述计算设备包括:处理器和存储器,所述存储器上存储有计算机可读指令,所述计算机可读指令由所述处理器执行以完成以下操作:
    在所述虚拟操作系统中启动应用程序;
    对所述应用程序的待呈现的虚拟场景进行渲染处理,得到渲染所述虚拟场景所使用的渲染数据;
    当识别到所述渲染数据中包括预设的贴图图片时,调用键位配置文件,执行所述预设的贴图图片与所述计算设备的对应实体按键的匹配处理;
    在所述计算设备的显示屏幕上呈现所述虚拟场景及与所述虚拟场景对应的按键配置信息,所述按键配置信息与匹配到的实体按键关联。
  12. 根据权利要求11的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    通过检测所述应用程序对渲染函数的调用,得到所述渲染函数被调用时顶点缓存对象中的渲染数据,将所述顶点缓存对象中的渲染数据作为渲染所述虚拟场景所使用的渲染数据,所述渲染函数用于对所述虚拟场景进行渲染处理。
  13. 根据权利要求11的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    当确定所述渲染数据中包括所述预设的贴图图片所在的预设贴图时,以及,
    当确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息时,确定所述渲染数据中包括所述预设的贴图图片。
  14. 根据权利要求13的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    比较所述渲染数据中的贴图的标识和记录的所述预设贴图的标识;
    在确定所述渲染数据中的贴图的标识和记录的所述预设贴图的标识匹配时,确定所述渲染数据中包括所述预设贴图。
  15. 根据权利要求13或14的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    比较所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标;
    在确定所述渲染数据中的贴图图片的纹理坐标和所述预设的贴图图片在所述预设贴图中的纹理坐标匹配时,确定所述渲染数据中包括所述预设的贴图图片在所述预设贴图中的位置信息。
  16. 根据权利要求14的计算设备,其特征在于,在所述虚拟操作系统中启动应用程序之后,所述计算机可读指令由所述处理器执行以完成以下操作:
    在检测到所述虚拟操作系统将应用程序数据从硬盘加载到显卡内存后,当确定所述应用程序数据中的贴图的循环冗余校验码和所述预设贴图的循环冗余校验码匹配时,将所述应用程序数据中的所述贴图的标识,作为所述预设贴图 的标识。
  17. 根据权利要求11的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    当识别到所述渲染数据中包括所述预设的贴图图片时,调用所述键位配置文件,根据所述键位配置文件中与所述预设的贴图图片对应的按键信息,确定所述预设的贴图图片对应的实体按键,使能所述实体按键生效;
    确定所述预设的贴图图片对应的实体按键在显示屏幕上的操作位置。
  18. 根据权利要求17的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    根据所述键位配置文件中与所述预设的贴图图片对应的按键信息,确定所述预设的贴图图片对应的实体按键在所述显示屏幕上的操作位置;或者
    根据所述预设的贴图图片在所述显示屏幕上的显示位置,确定所述预设的贴图图片对应的实体按键在所述显示屏幕上的操作位置。
  19. 根据权利要求18的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    查找所述渲染数据中的所述预设的贴图图片所要渲染的顶点坐标;
    根据所述顶点坐标,获取所述预设的贴图图片在所述显示屏幕上的显示区域;
    根据所述预设的贴图图片在所述显示屏幕上的显示区域,获取所述预设的贴图片对应的实体按键在所述显示屏幕上的操作位置。
  20. 根据权利要求11的计算设备,其特征在于,所述计算机可读指令由所述处理器执行以完成以下操作:
    检测在第一操作系统下通过输入输出设备输入的对所述预设的贴图图片对应的实体按键的操作消息;
    将所述预设的贴图图片对应的实体按键在所述显示屏幕上的操作位置写入触摸屏设备文件中,由触摸屏驱动程序从所述触摸屏设备文件中读取所述操作位置,并将所述操作位置送入所述应用程序中,由所述应用程序在所述显示屏幕上的所述操作位置进行与所述虚拟场景的交互操作。
  21. 一种计算机存储介质,其特征在于,所述计算机存储介质存储有多条指令,所述指令适于由处理器加载并执行如权利要求1至10任意一项的方法步骤。
PCT/CN2019/109985 2018-11-09 2019-10-08 应用程序的虚拟场景识别与交互键位匹配方法及计算设备 WO2020093825A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP19881725.6A EP3879496A4 (en) 2018-11-09 2019-10-08 METHOD FOR DETECTION OF VIRTUAL SCENES AND FOR INTERACTION KEY POSITION ADAPTATION FOR APPLICATION PROGRAM AND COMPUTER DEVICE
JP2020563939A JP7221989B2 (ja) 2018-11-09 2019-10-08 アプリケーションの仮想シーン認識及びインタラクションキーマッチング方法、並びに計算装置
US17/085,649 US11511188B2 (en) 2018-11-09 2020-10-30 Virtual scene recognition and interaction key position matching method for application and computing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811330926.1A CN109544663B (zh) 2018-11-09 2018-11-09 应用程序的虚拟场景识别与交互键位匹配方法及装置
CN201811330926.1 2018-11-09

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/085,649 Continuation US11511188B2 (en) 2018-11-09 2020-10-30 Virtual scene recognition and interaction key position matching method for application and computing device

Publications (1)

Publication Number Publication Date
WO2020093825A1 true WO2020093825A1 (zh) 2020-05-14

Family

ID=65846634

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/109985 WO2020093825A1 (zh) 2018-11-09 2019-10-08 应用程序的虚拟场景识别与交互键位匹配方法及计算设备

Country Status (5)

Country Link
US (1) US11511188B2 (zh)
EP (1) EP3879496A4 (zh)
JP (1) JP7221989B2 (zh)
CN (1) CN109544663B (zh)
WO (1) WO2020093825A1 (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109544663B (zh) * 2018-11-09 2023-01-06 腾讯科技(深圳)有限公司 Virtual scene recognition and interaction key position matching method and apparatus for application
CN111598976B (zh) * 2019-02-01 2023-08-22 华为技术有限公司 Scene recognition method and apparatus, terminal, and storage medium
CN110227264B (zh) * 2019-06-06 2023-07-11 腾讯科技(成都)有限公司 Virtual object control method and apparatus, readable storage medium, and computer device
CN112657177A (zh) * 2019-10-16 2021-04-16 宏碁股份有限公司 Game operation optimization method and mobile device
CN110860085B (zh) * 2019-11-14 2023-04-25 网易(杭州)网络有限公司 Keyboard and mouse setting method and apparatus
CN111510780B (zh) * 2020-04-10 2021-10-26 广州方硅信息技术有限公司 Video live streaming control, bridging, flow control and playback control methods, and client
CN112200712B (zh) * 2020-09-08 2023-10-27 成都安易迅科技有限公司 GLES image rendering method and apparatus, storage medium, and computer device
CN112274916B (zh) * 2020-11-20 2024-05-31 杭州雾联科技有限公司 Keyboard and mouse input method, apparatus, device, and medium
CN113140028A (zh) * 2021-04-08 2021-07-20 广州三七互娱科技有限公司 Virtual object rendering method and apparatus, and electronic device
CN113617026B (zh) * 2021-08-27 2023-07-25 腾讯科技(深圳)有限公司 Cloud game processing method and apparatus, computer device, and storage medium
CN113617027B (zh) * 2021-08-27 2023-10-24 腾讯科技(深圳)有限公司 Cloud game processing method, apparatus, device, and medium
CN113975804B (zh) * 2021-10-29 2024-04-19 腾讯科技(深圳)有限公司 Virtual control display method, apparatus, device, storage medium, and product

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070037625A1 (en) * 2005-06-28 2007-02-15 Samsung Electronics Co., Ltd. Multiplayer video gaming system and method
WO2009026198A1 (en) * 2007-08-20 2009-02-26 Double Fusion, Inc. Independently-defined alteration of output from software executable using later-integrated code
US8041860B2 (en) * 2008-12-15 2011-10-18 Cywee Group Limited Method for producing a mapping tool, a PC game having the mapping tool and operation method therefore
US8539039B2 (en) * 2010-06-22 2013-09-17 Splashtop Inc. Remote server environment
US20120086630A1 (en) * 2010-10-12 2012-04-12 Sony Computer Entertainment Inc. Using a portable gaming device to record or modify a game or application in real-time running on a home gaming system
EP2661666A4 (en) * 2011-01-05 2015-01-21 Razer Asia Pacific Pte Ltd SYSTEMS AND METHODS FOR MANAGING, SELECTING, AND UPDATING CONTENT FOR THE VISUAL INTERFACE USING DISPLAY KEYPADS, AND / OR OTHER USER INPUT DEVICES
KR101759537B1 (ko) * 2011-08-17 2017-07-31 (주)경문엔터테인먼트 Key mapping method for screen-touch buttons
US9671874B2 (en) * 2012-11-08 2017-06-06 Cuesta Technology Holdings, Llc Systems and methods for extensions to alternative control of touch-based devices
CN103092612B (zh) * 2012-12-31 2016-12-28 深圳天珑无线科技有限公司 Method and electronic device for implementing 3D desktop texture mapping in the Android operating system
US9751005B1 (en) * 2013-12-13 2017-09-05 Aftershock Services, Inc. Facilitating map navigation in an online game
CN105513038B (zh) * 2014-10-20 2019-04-09 网易(杭州)网络有限公司 Image matching method and mobile phone application testing platform
CN104469465A (zh) * 2014-12-09 2015-03-25 四川长虹电器股份有限公司 Android-based smart TV floating main scene interaction method
CN104599319B (zh) * 2014-12-26 2017-05-31 福建工程学院 Real-time generation method for 3D scenes
US9857939B2 (en) * 2015-02-27 2018-01-02 Accenture Global Services Limited Three-dimensional virtualization
CN104740872B (zh) * 2015-04-13 2018-06-19 北京奇虎科技有限公司 Method and apparatus for controlling running of a game program in a simulated Android environment
US10062181B1 (en) * 2015-07-30 2018-08-28 Teradici Corporation Method and apparatus for rasterizing and encoding vector graphics
US10338673B2 (en) * 2015-09-16 2019-07-02 Google Llc Touchscreen hover detection in an augmented and/or virtual reality environment
CN105641931A (zh) * 2016-03-31 2016-06-08 深圳市创想天空科技股份有限公司 Game operation configuration method and system
CN106027753B (zh) * 2016-04-27 2020-01-10 广州百田信息科技有限公司 Control method and apparatus for a point-and-read story machine
KR20180111397A (ko) * 2017-04-02 2018-10-11 둘툰 주식회사 Method for creating and mapping a game virtual controller using an external input device
US10238960B2 (en) * 2017-04-26 2019-03-26 Microsoft Technology Licensing, Llc Dual input multilayer keyboard
CN107297073B (zh) * 2017-06-15 2022-11-01 广州华多网络科技有限公司 Method and apparatus for simulating peripheral input signals, and electronic device
CN107320956B (zh) * 2017-08-08 2018-07-06 腾讯科技(深圳)有限公司 Game interface generation method and apparatus
CN107789830A 2017-09-15 2018-03-13 网易(杭州)网络有限公司 Information processing method and apparatus, electronic device, and storage medium
CN107909641A (zh) * 2017-10-26 2018-04-13 广州市雷军游乐设备有限公司 Baked rendering method and apparatus, terminal device, and storage medium
CN107635078B (zh) * 2017-10-30 2020-12-22 Oppo广东移动通信有限公司 Game control method and device
CN108176048B (zh) * 2017-11-30 2021-02-19 腾讯科技(深圳)有限公司 Image processing method and apparatus, storage medium, and electronic device
CN107970562A (zh) * 2017-12-06 2018-05-01 中南林业科技大学 Virtual reality game implementation method for a rowing machine
US11383164B2 (en) * 2018-02-23 2022-07-12 Rovi Guides, Inc. Systems and methods for creating a non-curated viewing perspective in a video game platform based on a curated viewing perspective
CN108564646B (zh) * 2018-03-28 2021-02-26 腾讯科技(深圳)有限公司 Object rendering method and apparatus, storage medium, and electronic device
EP3782023A4 (en) * 2018-04-04 2021-12-29 Nicolades, Ian Scripting engine and implementations
CN108579082A (zh) * 2018-04-27 2018-09-28 网易(杭州)网络有限公司 Method, apparatus and terminal for displaying light and shadow in a game

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102707937A * 2011-08-29 2012-10-03 新奥特(北京)视频技术有限公司 Method and system for implementing touchscreen menu scenes in a graphics system
CN102662466A * 2012-03-28 2012-09-12 深圳市兴达实电子有限公司 Tablet computer game controller system and control method thereof
CN102799359A * 2012-06-20 2012-11-28 宇龙计算机通信科技(深圳)有限公司 Terminal and terminal control method
CN103823673A * 2014-02-12 2014-05-28 陈昱 Method for a cloud computing system terminal to connect a GTK graphical interface with cloud applications
CN106909547A * 2015-12-22 2017-06-30 北京奇虎科技有限公司 Browser-based picture loading method and apparatus
CN109544663A * 2018-11-09 2019-03-29 腾讯科技(深圳)有限公司 Virtual scene recognition and interaction key position matching method and apparatus for application

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3879496A4 *

Also Published As

Publication number Publication date
US11511188B2 (en) 2022-11-29
JP7221989B2 (ja) 2023-02-14
EP3879496A1 (en) 2021-09-15
EP3879496A4 (en) 2022-01-05
CN109544663A (zh) 2019-03-29
CN109544663B (zh) 2023-01-06
JP2021524094A (ja) 2021-09-09
US20210046382A1 (en) 2021-02-18

Similar Documents

Publication Publication Date Title
WO2020093825A1 (zh) Virtual scene recognition and interaction key position matching method for application and computing device
US7536655B2 (en) Three-dimensional-model processing apparatus, three-dimensional-model processing method, and computer program
WO2015032282A1 (zh) Method and apparatus for testing hardware performance of an electronic device
US9937415B1 (en) Virtual controller for touchscreen
JP2022019748A (ja) Device and method for generating dynamic virtual content in mixed reality
US20120242664A1 (en) Accelerometer-based lighting and effects for mobile devices
CN105531756A (zh) Information processing device, information processing method, and computer program
US11951390B2 (en) Method and system for incremental topological update within a data flow graph in gaming
WO2016136380A1 (ja) Information processing system and program, server, terminal, and medium
US20230405452A1 (en) Method for controlling game display, non-transitory computer-readable storage medium and electronic device
US10872455B2 (en) Method and portable electronic device for changing graphics processing resolution according to scenario
US20190060754A1 (en) Storage medium, information processing apparatus, image processing method, and information processing system
WO2022022729A1 (zh) Rendering control method, device, and system
CN112799801B (zh) Simulated mouse pointer drawing method, apparatus, device, and medium
TW202138971A (zh) Interaction method and apparatus, interaction system, electronic device, and storage medium
US11769293B2 (en) Camera motion estimation method for augmented reality tracking algorithm and system therefor
CN111462269A (zh) Image processing method and apparatus, storage medium, and electronic device
US20220249955A1 (en) Method and system for automatic normal map detection and correction
KR100692210B1 (ko) Method for rendering a predetermined object in a game engine, and recording medium on which a program for performing the rendering method is recorded
TWI815021B (zh) Depth computing device and depth computing method for augmented reality
Madsen et al. OpenGL Game Development By Example
WO2024093609A1 (zh) Rendering method and apparatus with superimposed light occlusion, and related product
KR20240054780A (ko) Object pose correction method and system
CN116541110A (zh) Information prompting method and apparatus, storage medium, and electronic device
CN115089965A (zh) Method, apparatus, device, and storage medium for path presentation in a scene

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19881725

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020563939

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019881725

Country of ref document: EP

Effective date: 20210609