CN114733197A - Information processing method and device in game, readable storage medium and electronic equipment
- Publication number: CN114733197A (application CN202210389237.8A)
- Authority: CN (China)
- Prior art keywords: game, lens, user interface, display area, graphical user
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The disclosure relates to the technical field of human-computer interaction, and provides an in-game information processing method and apparatus, a computer-readable storage medium, and an electronic device. The method includes: displaying a first picture in a graphical user interface of a terminal device, where the terminal device includes a camera module; in response to an occlusion operation on a lens of the camera module, determining a target display area in the graphical user interface according to the occlusion state of the lens; and switching the content displayed in the target display area from a local part of the first picture to a target picture. With this scheme, the game picture displayed in the graphical user interface can be locally switched based on the user's occlusion of the camera lens of the terminal device, which enhances game interactivity and improves the user's game experience.
Description
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to an in-game information processing method, an in-game information processing apparatus, a computer-readable storage medium, and an electronic device.
Background
At present, most games display game pictures corresponding to fictional game scenes during play. Although some games provide an AR (Augmented Reality) function, the game can only switch directly from the virtual game scene to the real scene after the user taps a dedicated entry button, and the displayed picture is likewise switched directly from the virtual-scene picture to the real-scene picture. In other words, the graphical user interface shows either the virtual game scene picture or the real scene picture, but cannot display both at the same time.
Clearly, the game pictures displayed in the prior art are monotonous, the switch between different types of game pictures is abrupt, and the interactivity is weak, which degrades the user's game experience.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to an in-game information processing method and apparatus, a computer-readable storage medium, and an electronic device, so as to alleviate, at least to some extent, the problems of poor interactivity and poor user experience when displaying game pictures.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, an in-game information processing method is provided, applied to a terminal device that includes a camera module. The method includes: displaying a first picture in a graphical user interface of the terminal device; in response to an occlusion operation on a lens of the camera module, determining a target display area in the graphical user interface according to the occlusion state of the lens; and switching the content displayed in the target display area from a local part of the first picture to a target picture.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, determining, in response to an occlusion operation on a lens of the camera module, a target display area in the graphical user interface according to the occlusion state of the lens includes: in response to a partial occlusion operation on the lens of the camera module, determining the target display area in the graphical user interface according to the occlusion state of the lens.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, determining the target display area in the graphical user interface according to the occlusion state of the lens includes: determining the target display area according to the display area corresponding, in the graphical user interface, to the occluded part of the lens; or determining the target display area according to the display area corresponding, in the graphical user interface, to the unoccluded part of the lens.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the first picture is a real scene picture generated by a lens of the camera module, and the target picture is a virtual scene picture corresponding to the game; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a real scene picture generated through a lens of the camera module; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a scene thumbnail corresponding to the virtual scene of the game.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, determining, in response to an occlusion operation on a lens of the camera module, a target display area in the graphical user interface according to the occlusion state of the lens includes: in response to the occlusion operation on the lens of the camera module, determining a color block area formed in the first picture by the occluded part of the lens; and determining the target display area in the graphical user interface according to the color block area.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, before the target display area is determined in the graphical user interface according to the color block area, the method further includes: adjusting the color of the color block area so that the color block area is visually distinguishable in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the method further includes: and responding to the adjustment operation of the shielding area of the lens of the camera module, and adjusting the display size of the target display area in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the method further includes: and controlling a target game role in the game to execute a target game action in response to the display size of the target display area in the graphical user interface meeting a first preset condition.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the game includes a decryption-type game; the method further comprises the following steps: and responding to the display size of the target display area in the graphical user interface meeting a second preset condition, and displaying a decryption clue and/or a decryption element corresponding to the second preset condition in the graphical user interface.
According to a second aspect of the present disclosure, there is provided an information processing apparatus in a game, applied to a terminal device, the terminal device including a camera module, the apparatus including: a game screen display module configured to display a first screen in a graphical user interface of the terminal device; the target display area determining module is configured to respond to shielding operation of a lens of the camera module and determine a target display area in the graphical user interface according to shielding conditions of the lens; a partial game screen switching module configured to switch the content displayed in the target display area from a partial screen in the first screen to a target screen.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of processing information in a game as described in the first aspect of the embodiments above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: a processor; and a storage device for storing one or more programs, which when executed by the one or more processors, cause the one or more processors to implement the in-game information processing method according to the first aspect of the above embodiments.
As can be seen from the foregoing technical solutions, the in-game information processing method, the in-game information processing apparatus, and the computer-readable storage medium and the electronic device for implementing the in-game information processing method in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in the technical solutions provided by some embodiments of the present disclosure, in response to an occlusion operation on a lens of the camera module of a terminal device, a target display area can be determined in the graphical user interface of the terminal device according to the occlusion state of the lens, and the content displayed in that target display area can then be switched from a local part of the first picture displayed in the graphical user interface to a target picture. Compared with the prior art, a local part of the first picture can be switched to the target picture simply by occluding the camera module of the terminal device, which enriches the display of game pictures and enhances the user's interactive game experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 shows a flow diagram of a method of information processing in a game in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a flow diagram of a method of determining a target display area in an exemplary embodiment of the present disclosure;
FIG. 3 illustrates a screen displayed after a partial screen in a game screen is switched to a target screen in an exemplary embodiment of the present disclosure;
FIG. 4 illustrates content displayed in a graphical user interface when a camera module of a terminal device is not enabled in an exemplary embodiment of the disclosure;
FIG. 5 illustrates a flow diagram of a decryption method of decrypting a class game in an exemplary embodiment of the present disclosure;
FIG. 6 illustrates a game screen displayed in a graphical user interface in an exemplary embodiment of the present disclosure;
FIG. 7 shows a schematic configuration diagram of an information processing apparatus in a game in an exemplary embodiment of the present disclosure;
FIG. 8 shows a schematic diagram of a structure of a computer storage medium in an exemplary embodiment of the present disclosure;
FIG. 9 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
Currently, most games display game pictures corresponding to fictional game scenes during play, and players perform game operations directly in the virtual game scene displayed in the graphical user interface of a terminal device. Some games provide an AR function: a player can tap a dedicated entry button to switch from the virtual game scene to the real scene and then perform game operations in the real scene.
In the related art, although such games provide an AR function, the virtual game scene jumps directly to the real scene: the game picture displayed in the graphical user interface is switched completely from the virtual-scene picture to the real-scene picture, and the two cannot be displayed at the same time. As a result, there is little interaction between the virtual game scene and the real scene, and the interactivity between the user's operations and the displayed game picture is insufficient, which degrades the user's game experience.
In an embodiment of the present disclosure, there is provided an information processing method in a game, which overcomes, at least to some extent, the above-mentioned drawbacks in the related art.
Fig. 1 is a flowchart illustrating an information processing method in a game in an exemplary embodiment of the present disclosure, which may be applied to a terminal device including a camera module. Referring to fig. 1, the method includes:
step S110, displaying a first picture in a graphical user interface of the terminal equipment;
step S120, responding to the shielding operation of the lens of the camera module, and determining a target display area in the graphical user interface according to the shielding condition of the lens;
step S130, switching the content displayed in the target display area from a partial screen in the first screen to a target screen.
In the technical solution provided in the embodiment shown in fig. 1, in response to an occlusion operation on a lens of the camera module of the terminal device, a target display area is determined in the graphical user interface according to the occlusion state of the lens, and the content displayed in that target display area is then switched from a local part of the first picture to a target picture. Compared with the prior art, occluding the camera module of the terminal device switches a local part of the first picture displayed in the graphical user interface to the target picture, which enriches the display of game pictures and enhances the user's interactive game experience.
The steps in the example shown in fig. 1 are described in detail below:
in step S110, a first screen is displayed in a graphical user interface of the terminal device.
In the present disclosure, the terminal device may include a mobile terminal, such as a smart phone, a tablet computer, a wearable electronic device, or any other terminal device that has a display function and a camera module. The camera module can be understood as the component of the terminal device that captures images, and it may have one or more lenses for image capture.
Illustratively, the first screen may be a virtual scene screen corresponding to the game. Based on this, one specific implementation of step S110 may include: and displaying a virtual scene picture corresponding to the virtual game scene of the game in a graphical user interface of the terminal equipment.
For example, the first picture may be a real scene picture generated by a lens of the camera module. Based on this, another specific implementation of step S110 may include: and displaying the real scene picture acquired by the camera module in a graphical user interface of the terminal equipment.
In other words, before the lens of the camera module of the terminal device is occluded, the first picture displayed in the graphical user interface of the terminal device may be either the virtual scene picture corresponding to the virtual game scene of the game, or a real scene picture of the real environment where the terminal device is currently located, captured by the camera module. The present exemplary embodiment is not particularly limited in this regard.
In an optional implementation manner, in response to a first trigger operation of logging in the game through the terminal device, a virtual scene picture corresponding to a virtual game scene of the game is displayed in a graphical user interface of the terminal device through a first canvas.
In an optional implementation manner, in response to a second trigger operation for enabling the camera module of the terminal device, the first canvas is covered by a second canvas, so that a real scene picture shot by a lens of the camera module is displayed in the second canvas.
For example, when a user logs in to the game on a terminal device, the game server or the terminal device may first use a first canvas (the canvas a game development engine uses to display game pictures) to show the virtual scene picture rendered by the game engine for the virtual game scene. The user may then enable the camera module of the terminal device when prompted or as needed; once the camera is enabled, a second canvas on the UI (User Interface) layer is used to display the real scene picture captured by the camera module. The second canvas is overlaid on top of the first canvas and completely blocks the virtual game scene displayed by the first canvas, so that the real scene picture is shown through the second canvas.
That is to say, after the user logs in to the game with a game account but before the camera module of the terminal device is enabled, the graphical user interface of the terminal device displays the virtual scene picture; only after the user enables the camera module does the graphical user interface display the real scene picture captured by the camera module.
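As a rough illustration of this two-canvas layering, the sketch below treats each canvas as an RGB array of identical size and simply returns whichever canvas is on top; this is a minimal sketch under those assumptions, not the patent's implementation, and the helper name is invented for illustration.

```python
import numpy as np

def render_interface(first_canvas: np.ndarray,
                     second_canvas: np.ndarray,
                     camera_enabled: bool) -> np.ndarray:
    """Compose the graphical user interface from the two canvases described above.

    first_canvas: virtual game scene rendered by the game engine.
    second_canvas: real scene picture captured by the camera module (same shape).
    """
    if not camera_enabled:
        # Camera module not enabled: only the virtual-scene canvas is visible.
        return first_canvas
    # Camera module enabled: the second canvas sits on top and fully covers the first.
    return second_canvas.copy()
```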
In another exemplary embodiment, enabling the camera module may not depend on the player's choice at all. For example, the camera module may be enabled automatically by default when the player opens or logs in to the game; when the player does not want to use it, the camera module can be turned off accordingly, changing it from the default enabled state to a disabled state. The present exemplary embodiment is not particularly limited in this regard.
For example, in the present disclosure, after the user logs in to the game on the terminal device, the real scene picture captured by the camera module may be displayed in the graphical user interface; for instance, a canvas 1 is used to display the real scene picture captured by the camera module when the user logs in. The user may then choose to turn off the camera module as needed; after the camera module is turned off, a canvas 2 can be overlaid on canvas 1, and the virtual game scene picture rendered by the game engine is displayed in canvas 2.
For example, after the user closes the camera module, the camera module may be enabled again according to a requirement or a prompt, which is not particularly limited in this exemplary embodiment.
In another exemplary embodiment, even if the camera module of the terminal device is in an enabled state, in a case that the lens of the camera module of the terminal device is not blocked, only the virtual game scene screen may be displayed in the graphical user interface of the terminal device, which is not particularly limited in this exemplary embodiment.
Next, in step S120, in response to a shielding operation for the lens of the camera module, a target display area is determined in the graphical user interface according to a shielding condition of the lens.
In an optional implementation manner, a specific implementation manner of step S120 may be that, when the camera module is in an enabled state, in response to an occlusion operation for a lens of the camera module, a target display area is determined in the graphical user interface according to an occlusion condition of the lens.
In another alternative embodiment, a specific implementation manner of step S120 may be that, in response to a partial shielding operation for a lens of a camera module of the terminal device, a target display area is determined in the graphical user interface according to a shielding condition of the lens.
In other words, the blocking operation for the lens of the camera module may include a partial blocking operation for the lens of the camera module. Of course, the blocking operation for the lens of the camera module may be a total blocking operation.
In an optional implementation manner, determining a target display area in the graphical user interface according to the occlusion condition of the lens includes: and determining a target display area according to the corresponding display area of the shielded part of the lens in the graphical user interface.
For example, a display area corresponding to the blocked part of the lens in the graphical user interface may be identified, and then the display area may be determined as the target display area.
In another optional implementation, determining a target display area in the graphical user interface according to the lens occlusion condition includes: and determining a target display area according to the corresponding display area of the part of the lens which is not shielded in the user interface.
For example, after the display area corresponding to the part of the lens which is occluded in the graphical user interface is identified, other areas except the display area in the graphical user interface may be determined as the target display area.
For example, fig. 2 is a flowchart illustrating a method for determining a target display area according to an exemplary embodiment of the disclosure. Referring to fig. 2, the method may include steps S210 to S220. Wherein:
in step S210, in response to a blocking operation for a lens of the camera module, a color block region formed in the first screen by a blocked portion of the lens is determined.
For example, when a user, i.e., a game player, partially blocks a lens of a camera module of a terminal device, for example, when a part of the lens is blocked by a finger, the finger is very close to the lens of the camera module, so that the part blocked by the finger appears as a large dark color block in a first picture.
Next, in step S220, the target display area is determined in the graphical user interface according to the color block area.
For example, before the target display region is determined in the graphical user interface according to the color-block region, the color of the color-block region may be adjusted, so that the color-block region has a visual distinction on the graphical user interface.
For example, the color of the patch region may be adjusted to a preset color. The preset color may be white, or may be another color that can be distinguished from the uncovered portion, which is not particularly limited in this exemplary embodiment.
For example, the color of the target color block may be converted to white (#FFFFFF) in code. In this way, the display area corresponding to the occluded part of the lens can be distinguished from the display area corresponding to the unoccluded part in the graphical user interface.
For example, determining the target display area according to the color block area may include: determining the color block area itself as the target display area, i.e., determining the display area to which the occluded part of the lens maps in the graphical user interface as the target display area.
For example, determining the target display area according to the color block area may alternatively include: determining the display area outside the color block area as the target display area, i.e., determining the display area to which the unoccluded part of the lens maps in the graphical user interface as the target display area.
Through the steps S210 to S220, the target display area can be determined according to the color block area corresponding to the part of the lens which is shielded in the graphical user interface.
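As a rough illustration of steps S210 to S220, the sketch below assumes the occluded part of the lens shows up as a large dark blob in the camera frame and uses OpenCV to threshold it, clean it up, and recolor it (e.g. to white) so it is visually distinct; the threshold values and function names are assumptions for illustration, not values from the patent.

```python
import cv2
import numpy as np

def color_block_mask(camera_frame: np.ndarray,
                     dark_threshold: int = 40,
                     min_area_ratio: float = 0.05):
    """Return a uint8 mask (255 inside the color block area), or None if none is found."""
    gray = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2GRAY)
    dark = (gray < dark_threshold).astype(np.uint8)
    # Morphological opening removes small speckles so only the finger-sized blob remains.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    dark = cv2.morphologyEx(dark, cv2.MORPH_OPEN, kernel)
    if int(dark.sum()) < min_area_ratio * dark.size:
        return None
    return dark * 255

def recolor_block(camera_frame: np.ndarray, block_mask: np.ndarray) -> np.ndarray:
    """Adjust the color of the color block area (here to white) for visual distinction."""
    out = camera_frame.copy()
    out[block_mask > 0] = (255, 255, 255)
    return out
```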
In another exemplary embodiment, while the camera module of the terminal device is enabled, the game server or the game client may detect in real time whether the lens of the camera module is occluded. When the lens is occluded, a first ratio between the area of the occluded part of the lens and the total area of the lens may be determined, and the graphical user interface may be divided according to this first ratio into a first display area and a second display area, such that the share of the graphical user interface occupied by the first display area equals the first ratio. Either the first display area or the second display area is then determined as the target display area.
In another exemplary embodiment, the graphical user interface may instead be divided into a first display area and a second display area according to a second ratio between the area of the unoccluded part of the lens and the total area of the lens, such that the share of the graphical user interface occupied by the first display area equals the second ratio; the first display area or the second display area is then determined as the target display area.
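A minimal sketch of this ratio-based division, assuming a vertical split (the patent does not fix the split direction) and that the occluded area of the lens is already available as a pixel count; all names are illustrative.

```python
import numpy as np

def split_interface_by_ratio(ui_height: int, ui_width: int,
                             occluded_pixels: int, total_pixels: int) -> np.ndarray:
    """Return a boolean mask marking the first display area in the graphical user interface.

    The first display area's share of the interface equals the first ratio, i.e. the
    occluded share of the lens; the remaining pixels form the second display area.
    """
    first_ratio = occluded_pixels / total_pixels
    split_col = int(round(ui_width * first_ratio))
    first_area = np.zeros((ui_height, ui_width), dtype=bool)
    first_area[:, :split_col] = True
    return first_area
```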
Of course, the target display area may also be determined in other manners, which is not limited in this exemplary embodiment.
Next, in step S130, the content displayed in the target display area is switched from the partial screen in the first screen to the target screen.
In an exemplary embodiment, when the first picture is a virtual scene picture of the game, the target picture is a real scene picture generated by a lens of the camera module. That is, in step S110, if a virtual scene picture corresponding to a game is currently displayed in the graphical user interface of the terminal device, a specific implementation manner of step S130 may be to switch the content displayed in the target display area from a partial picture in the virtual scene picture to a real scene picture acquired through a part where the lens is not occluded.
For example, if a virtual scene picture is currently displayed in a graphical user interface of the terminal device, a target display area can be determined according to an unshielded part in a lens in response to a partial shielding operation of the lens of the camera module of the terminal device, and then a real scene picture collected through the unshielded part of the lens is displayed in the target display area. Therefore, the virtual game scene picture and the real scene picture can be displayed simultaneously in the graphical user interface of the terminal equipment.
For example, switching the content displayed in the target display area from a local part of the virtual scene picture to the real scene picture captured through the unoccluded part of the lens may include: taking the display area outside the color block area corresponding to the occluded part of the lens as the target display area, i.e., taking the area mapped by the unoccluded part of the lens as the target display area, and then using the target display area as a mask (a masking layer) for displaying the real-scene canvas. In this way, the display area to which the occluded part of the lens maps shows the virtual game scene picture, while the display area to which the unoccluded part maps shows the real scene picture. That is, the target display area shows the real scene picture, and the area outside the target display area still shows the virtual game scene picture.
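The masking step just described can be sketched as a simple per-pixel composite, assuming the virtual-scene frame, the real-scene frame, and the target-area mask all have the same resolution; the names are illustrative, not from the patent.

```python
import numpy as np

def switch_local_picture(virtual_frame: np.ndarray,
                         real_frame: np.ndarray,
                         target_area: np.ndarray) -> np.ndarray:
    """Show the real scene inside the target display area, the virtual scene elsewhere.

    target_area is a boolean mask that is True where the lens is NOT occluded,
    i.e. the target display area used as a mask for the real-scene canvas.
    """
    return np.where(target_area[..., None], real_frame, virtual_frame)
```

Swapping the two frame arguments gives the opposite assignment, where the occluded part shows the virtual scene, as in the case described next.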
In another exemplary embodiment, when the first picture is a real scene picture generated through a lens of the camera module, the target picture is a virtual scene picture corresponding to the game. That is, in step S110, if the first screen displayed in the gui of the terminal device is the real scene screen formed through the lens of the camera module, a specific implementation manner of step S130 may be to switch the content displayed in the target display area from a partial screen in the real scene screen to a virtual scene screen in the game.
For example, if a real scene picture is currently displayed in the graphical user interface of the terminal device, then in response to a partial occlusion of the lens of the camera module, a target display area can be determined according to the occluded part of the lens, and the virtual scene picture rendered by the game engine is displayed in that target display area. In this way, the virtual game scene picture and the real scene picture can likewise be displayed simultaneously in the graphical user interface of the terminal device.
For example, switching the content displayed in the target display area from a local part of the real scene picture to the virtual scene picture of the game may include: determining the color block area corresponding to the occluded part of the lens as the target display area, and then using the target display area as a mask (a masking layer) for displaying the virtual-game-scene canvas, so that the display area to which the occluded part of the lens maps shows the virtual game scene picture, while the display area to which the unoccluded part maps shows the real scene picture. That is, the target display area shows the virtual scene picture, and the area outside the target display area still shows the real scene picture.
In another exemplary embodiment, when the first picture is the virtual scene picture corresponding to the game, the target picture is a scene thumbnail corresponding to the virtual game scene of the game. That is, in step S110, if the picture displayed in the graphical user interface of the terminal device is the virtual scene picture of the virtual game scene, a specific implementation of step S130 may be: switching the content displayed in the target display area from a local part of the virtual scene picture to the scene thumbnail corresponding to the virtual game scene.
The scene thumbnail corresponding to the virtual game scene can be understood as a game map.
For example, if a virtual scene picture is currently displayed in a graphical user interface of the terminal device, in response to a partial shielding operation for a lens of a camera module of the terminal device, a target display area may be determined according to a shielded portion in the lens, or a target display area may be determined according to an unshielded portion in the lens, and then a game map is displayed in the target display area. In this way, the game map corresponding to the virtual game scene and the virtual game scene screen can be displayed simultaneously on the graphical user interface of the terminal device.
For example, switching the content displayed in the target display area from a local part of the virtual scene picture to the scene thumbnail corresponding to the virtual game scene may include: using the color block area formed in the first picture by the occluded part of the lens as a mask (a masking layer) for displaying the game-map canvas, or using the display area of the first picture outside that color block area as the mask for displaying the game-map canvas. In this way, the area corresponding to either the occluded or the unoccluded part of the lens continues to show the virtual game scene picture while the other part shows the game map, so that the virtual game scene and the game map can be displayed simultaneously in the graphical user interface. The user can thus bring up the game map simply by occluding the lens as needed, which improves the user's interactive game experience.
In an exemplary embodiment, when the virtual game scene picture and the real scene picture are displayed simultaneously in the graphical user interface in response to a partial occlusion of the lens of the camera module, the camera module being enabled can be understood as its shooting function being turned on in the game, and the camera module being disabled as its shooting function being turned off in the game.
Likewise, when the virtual game scene picture and the game map corresponding to the virtual game scene are displayed simultaneously in the graphical user interface in response to a partial occlusion of the lens of the camera module, the camera module being enabled can be understood as its map-opening function being turned on, and the camera module being disabled as its map-opening function being turned off.
In other words, the lens of the camera module can serve as a switch for opening the game map. When the map-opening function of the camera module is enabled, the game map corresponding to the virtual game scene can be displayed in the graphical user interface simply by the user occluding the lens of the camera module, which makes game operation convenient without adding any extra control and improves the user's game experience.
For example, when the virtual game scene picture and the real scene picture are displayed simultaneously in the graphical user interface according to a partial occlusion of the lens of the camera module, the lens in step S120 can be understood as the imaging camera of the camera module, i.e., the main camera lens. In general, when the main camera lens is completely occluded, the whole graphical user interface shows a dark color block, i.e., no image can be formed; when the main camera lens is partially occluded, a dark color block is shown in the first imaging area corresponding to the occluded part, while the captured real scene picture is shown normally in the second imaging area corresponding to the unoccluded part. That is, one part of the graphical user interface shows a dark color block and the other part shows the real scene picture captured by the camera. By contrast, when a lens other than the main camera lens is completely or partially occluded, the real scene picture captured by the camera module is still displayed in full in the graphical user interface.
In an exemplary embodiment, when the virtual game scene picture and the game map corresponding to the virtual game scene are displayed simultaneously in the graphical user interface according to a partial occlusion of the lens of the camera module, the lens in step S120 may be either the main camera lens of the camera module or a secondary camera lens of the camera module.
In an exemplary embodiment, the information processing method in the game shown in fig. 1 may further include: and responding to the adjustment operation of the shielding area of the lens of the camera module, and adjusting the display size of the target display area in the graphical user interface.
For example, while occluding the lens, the game player can adjust the occluded area as needed. When the occluded area of the lens changes, the size of the corresponding target color block in the graphical user interface changes as well, so the display size of the target display area in the graphical user interface can be adjusted in real time according to the change in the occluded area of the lens.
In an exemplary embodiment, the information processing method in the game shown in fig. 1 may further include: and controlling a target game role in the game to execute a target game action in response to the display size of the target display area in the graphical user interface meeting a first preset condition.
In an alternative embodiment, the first preset condition may be, for example, that the display size of the target display area is greater than a first preset value (or less than or equal to it), or that the display ratio between the target display area and the non-target display area in the graphical user interface is greater than the first preset value (or less than or equal to it).
The first preset condition may be associated in advance with the target game character performing the target game action. During the game, the user can adjust the display size of the target display area in the graphical user interface by adjusting how much of the lens is covered; when that display size meets the first preset condition, the target game character is automatically controlled to perform the target game action. In this way, control of the target game character is linked to the user's occlusion operation, providing an additional way to make a game character perform a game action and improving the user's interactive game experience.
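As a hedged sketch of the first preset condition, the snippet below checks the target display area's share of the graphical user interface against an assumed threshold and invokes a hypothetical game-side callback; neither the threshold value nor the callback name comes from the patent.

```python
FIRST_PRESET_RATIO = 0.5  # assumed threshold; the patent leaves the preset value open

def maybe_perform_target_action(target_area_pixels: int,
                                ui_pixels: int,
                                target_character) -> bool:
    """Trigger the target game action once the target display area is large enough."""
    if ui_pixels == 0:
        return False
    if target_area_pixels / ui_pixels >= FIRST_PRESET_RATIO:
        target_character.perform_target_action()  # hypothetical game-side hook
        return True
    return False
```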
Illustratively, the game in the present disclosure may include a decryption-type game, and based on this, the information processing method in the game illustrated in fig. 1 may further include: and responding to the display size of the target display area in the graphical user interface meeting a second preset condition, and displaying a decryption clue and/or a decryption element corresponding to the second preset condition in the graphical user interface.
In an alternative embodiment, the second preset condition may be, for example, that the display size of the target display area is greater than a second preset value (or less than or equal to it), or that the display ratio between the target display area and the non-target display area in the graphical user interface is greater than the second preset value (or less than or equal to it).
The second preset condition may be associated in advance with a certain decryption element and/or decryption clue. During the game, the user can adjust the display size of the target display area in the graphical user interface by adjusting how much of the lens is covered; when that display size meets the second preset condition, the decryption element and/or decryption clue associated with the condition is displayed in the graphical user interface.
Next, an information processing method in a game according to the present disclosure will be further described with reference to fig. 3 to 7.
In an exemplary embodiment, as described above, when the user logs in to the game, a first canvas (the canvas a game development engine uses to display game pictures) may be used to display the virtual game scene picture rendered by the game engine for the virtual game scene. When the user chooses to enable the camera module of the terminal device, for example the main camera lens, a second canvas on the UI layer may be used to display the picture captured by the main camera lens. The second canvas is superimposed over the first canvas so that it completely obscures the virtual game scene. Therefore, after the main camera lens is enabled, if no part of it is occluded, the real scene picture it captures is displayed across the whole graphical user interface.
During the game, when the player covers part of the main camera lens, the covered part appears as a large dark color block in the graphical user interface. This color block can be converted into a preset color block, such as the white color block described above, and the white color block is then used as a mask for displaying the virtual-game-scene canvas. As a result, the display area corresponding to the covered part of the main camera lens switches from the real scene picture to the virtual game scene picture, while the display area corresponding to the uncovered part still shows the real scene picture captured by the camera.
As shown in fig. 3, the virtual game scene picture 31 and the real scene picture 32 are displayed simultaneously, separated by the dotted line. In this way, the virtual game scene picture and the real scene picture captured by the lens of the camera module can be displayed simultaneously in the graphical user interface based on the user's occlusion of the lens, and interaction between the virtual game scene and the real scene can be built on this basis.
The mask here is a masking layer (also called a matte). The irregular shape produced, in the display area of the graphical user interface, where the color block of the occluded part of the lens overlaps the canvas is the mask, and this mask is then used as a new canvas to display the virtual game scene.
It should be noted that the purpose of converting the dark color block corresponding to the occluded part of the main camera lens into a white color block is to clearly distinguish, in the first picture, the region corresponding to the occluded part of the lens from the real scene picture captured by the main camera lens; the color conversion may also be omitted, and this exemplary embodiment is not particularly limited in this regard.
In an exemplary embodiment, since the canvas displaying the real scene is superimposed over, and completely covers, the canvas displaying the virtual game scene, when the camera module is enabled the whole graphical user interface shows the virtual game scene picture if the lens is completely occluded, and shows the real scene picture captured by the lens if no part of the lens is occluded.
When the camera module is not enabled, the graphical user interface displays the virtual game scene picture of the game, as shown in fig. 4.
For example, after the user logs in to the game but before the camera module is enabled, the camera module is in a disabled state, i.e., its shooting function has not been turned on in the game, so the real scene picture captured by the camera module cannot be displayed in the graphical user interface. At this point, whether the lens of the camera module is completely occluded, partially occluded, or not occluded at all, only the virtual game scene picture is displayed across the whole graphical user interface.
After the user enables the camera module, it is in an enabled state: the real scene picture captured by the camera module can now be displayed in the graphical user interface, and the virtual game scene picture and the real scene picture can be displayed simultaneously according to the user's occlusion of the lens of the camera module.
When part of the lens of the camera module of the terminal device is occluded, the real-world scene within the camera module's field of view can still be captured through the unoccluded part of the lens, and the real scene picture captured through that part is mapped to, and displayed in, the corresponding imaging area of the graphical user interface. The principle is similar to taking a photograph: if part of the camera lens is covered, the covered part shows up as a dark color block on the phone screen, while the rest of the screen continues to display the image captured by the uncovered part of the lens normally.
That is to say, the unoccluded part of the lens is mapped to a first display area in the graphical user interface. When the real scene picture captured by the camera module is displayed, the position of this first display area in the graphical user interface matches the position of the unoccluded part within the whole lens: if the left half of the lens is unoccluded, the first display area is on the left side of the graphical user interface, so the real scene picture captured by the lens is correspondingly displayed on the left side.
Meanwhile, the part of the lens covered by the user is mapped to a second display area in the graphical user interface, in which the virtual game scene picture can be displayed, so that the virtual game scene picture and the real scene picture are displayed simultaneously in the graphical user interface.
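The positional mapping described in the last two paragraphs can be sketched as a simple resize of the occlusion mask to screen resolution, assuming the camera frame and the graphical user interface share the same aspect ratio and orientation; the helper name is illustrative.

```python
import cv2
import numpy as np

def map_lens_mask_to_gui(unoccluded_mask: np.ndarray,
                         ui_width: int, ui_height: int) -> np.ndarray:
    """Map the unoccluded-lens mask onto the GUI so positions correspond (left stays left).

    unoccluded_mask: boolean array over the camera frame, True where the lens is clear.
    Returns a boolean mask over the GUI marking the first display area (real scene).
    """
    resized = cv2.resize(unoccluded_mask.astype(np.uint8), (ui_width, ui_height),
                         interpolation=cv2.INTER_NEAREST)
    return resized.astype(bool)
```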
As described above, in the present disclosure, by adjusting the shielding area for the lens of the camera module, the corresponding display size of the target display area in the graphical user interface can be adjusted.
Through the occlusion operation on the lens, when the virtual game scene picture and the real scene picture are displayed simultaneously in the graphical user interface, the display size of the game map shown in the graphical user interface can also be changed by adjusting the occluded area of the camera lens, which further facilitates the user's game operation and improves the game experience.
When the virtual game scene picture and the real scene picture are displayed simultaneously through occlusion of the lens, changing the display size of the target display area in the graphical user interface also changes the display ratio between the area showing the virtual game scene picture and the area showing the real scene picture. For some decryption-type games, decryption clues and/or decryption elements can be set up that require the virtual and real scene pictures to be displayed simultaneously and can only be unlocked by continually adjusting the display ratio between them, thereby guiding the player to keep covering the camera and moving a finger to change how much of the camera is covered.
Illustratively, fig. 5 shows a flowchart of a decryption method for decrypting a game in an exemplary embodiment of the present disclosure. Exemplarily, referring to fig. 5, the method may include step S510 to step S520. Wherein:
in step S510, in response to an adjustment operation for a shielding area of a lens of the camera module, adjusting a size of the target display area in the graphical user interface to change a display ratio between a virtual game scene picture and a real scene picture in the graphical user interface;
in step S520, in response to that the display ratio between the virtual game scene picture and the real scene picture satisfies a second preset condition, a decryption cue and/or a decryption element corresponding to the second preset condition is displayed in the graphical user interface.
In an exemplary embodiment, some decryption clues and/or decryption elements may be pre-configured to be displayed only at a preset display ratio between the virtual game scene picture and the real scene picture. As shown in fig. 6, when the display ratio between the virtual game scene picture and the real scene picture satisfies the preset condition, the decryption element 61 can be displayed in the real scene picture corresponding to the real scene. The player can then perform a decryption operation according to the currently displayed decryption clue, or perform the game operation corresponding to the currently displayed decryption element to obtain its associated decryption clue and then perform the game operation corresponding to that clue in order to decrypt.
In another exemplary embodiment, a decryption operation corresponding to the decryption element and/or the decryption hint displayed in the graphical user interface may be performed in response to the display ratio between the virtual game scene picture and the real scene picture satisfying a second preset condition.
For example, the decryption operations of certain decryption elements and/or decryption clues may be associated in advance with the display ratio between the virtual game scene picture and the real scene picture. When the display ratio between the two reaches a preset value, the decryption operation corresponding to the decryption element and/or decryption clue can then be performed automatically, and the picture resulting from that decryption operation can be displayed directly in the graphical user interface.
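A minimal sketch of steps S510 to S520 under stated assumptions: the display ratio is recomputed as the player changes how much of the lens is covered, and when it enters an assumed target band the associated decryption clue or element is revealed through a hypothetical game-side hook; the band value is illustrative, not from the patent.

```python
SECOND_PRESET_BAND = (0.45, 0.55)  # assumed band for the second preset condition

def check_decryption_condition(virtual_pixels: int,
                               real_pixels: int,
                               reveal_clue) -> bool:
    """Reveal the associated decryption clue/element once the display ratio is in band."""
    total = virtual_pixels + real_pixels
    if total == 0:
        return False
    virtual_share = virtual_pixels / total  # share of the GUI showing the virtual scene
    if SECOND_PRESET_BAND[0] <= virtual_share <= SECOND_PRESET_BAND[1]:
        reveal_clue()  # hypothetical hook that displays the clue/element in the GUI
        return True
    return False
```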
Through the above steps S510 to S520, a new interaction mode can be provided for decryption-type games: the player controls the display ratio between the virtual game scene picture and the real scene picture in the graphical user interface by changing how much of the camera is covered, and certain game clues are then decrypted based on changes in that display ratio so as to advance the decryption of the game.
In this way, based on the user's occlusion of the lens of the camera module of the terminal device, the virtual game scene picture and the real scene picture can be displayed simultaneously in the graphical user interface, which enhances the interaction between the virtual game scene and the real scene and improves the player's game experience.
Those skilled in the art will appreciate that all or part of the steps to implement the above embodiments are implemented as a computer program executed by a CPU. The computer program, when executed by the CPU, performs the functions defined by the method provided by the present invention. The program may be stored in a computer readable storage medium, which may be a read-only memory, a magnetic or optical disk, or the like.
Furthermore, it should be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily appreciated that the processes illustrated in the above figures are not intended to indicate or limit the temporal order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Fig. 7 shows a schematic configuration diagram of an information processing apparatus in a game in an exemplary embodiment of the present disclosure. Referring to fig. 7, the apparatus 700 may be applied to a terminal device including a camera module, and may include a game screen display module 710, a target display area determination module 720, and a partial game screen switching module 730. Wherein:
a game screen display module 710 configured to display a first screen in a graphical user interface of the terminal device;
a target display area determining module 720, configured to respond to a shielding operation on the lens of the camera module, and determine a target display area in the graphical user interface according to the shielding condition of the lens;
a partial game screen switching module 730 configured to switch the content displayed in the target display area from a partial screen in the first screen to a target screen.
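For illustration only, a minimal sketch of this three-module structure follows. The class names, the GuiStub stand-in for the graphical user interface, and the coordinate-scaling logic are assumptions introduced for readability; the apparatus 700 is not limited to this form.

```python
# Hypothetical sketch of the apparatus 700 described above; not a prescribed
# implementation.
from dataclasses import dataclass
from typing import Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)


@dataclass
class GuiStub:
    """Stand-in for the graphical user interface of the terminal device."""
    width: int
    height: int

    def render_fullscreen(self, name: str) -> None:
        print(f"rendering {name} over the whole {self.width}x{self.height} GUI")

    def render_region(self, area: Rect, name: str) -> None:
        print(f"rendering {name} only inside region {area}")


class GameScreenDisplayModule:
    def display_first_screen(self, gui: GuiStub) -> None:
        gui.render_fullscreen("first screen (virtual game scene)")


class TargetDisplayAreaDeterminationModule:
    def determine(self, gui: GuiStub, occluded_lens_rect: Rect,
                  lens_size: Tuple[int, int]) -> Rect:
        # Scale the occluded lens rectangle into GUI coordinates.
        lx, ly, lw, lh = occluded_lens_rect
        sx, sy = gui.width / lens_size[0], gui.height / lens_size[1]
        return (int(lx * sx), int(ly * sy), int(lw * sx), int(lh * sy))


class PartialGameScreenSwitchingModule:
    def switch(self, gui: GuiStub, target_area: Rect) -> None:
        gui.render_region(target_area, "target screen (real scene)")


if __name__ == "__main__":
    gui = GuiStub(1920, 1080)
    GameScreenDisplayModule().display_first_screen(gui)
    area = TargetDisplayAreaDeterminationModule().determine(
        gui, occluded_lens_rect=(0, 0, 320, 480), lens_size=(640, 480))
    PartialGameScreenSwitchingModule().switch(gui, area)
```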
In an exemplary embodiment of the present disclosure, based on the foregoing embodiments, the target display area determining module 720 may be further specifically configured to: respond to a partial shielding operation on the lens of the camera module, and determine the target display area in the graphical user interface according to the shielding condition of the lens.
In an exemplary embodiment of the present disclosure, based on the foregoing embodiments, the target display area determining module 720 may be further specifically configured to: determine the target display area according to the display area in the graphical user interface corresponding to the shielded part of the lens; or determine the target display area according to the display area in the graphical user interface corresponding to the part of the lens that is not shielded.
In an exemplary embodiment of the present disclosure, based on the foregoing embodiment, the first picture is a real scene picture generated by a lens of the camera module, and the target picture is a virtual scene picture corresponding to the game; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a real scene picture generated through a lens of the camera module; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a scene thumbnail corresponding to the virtual scene of the game.
In an exemplary embodiment of the present disclosure, based on the foregoing embodiments, the target display area determining module 720 may be further specifically configured to: respond to a shielding operation on the lens of the camera module, and determine a color block area formed in the first picture by the shielded part of the lens; and determine the target display area in the graphical user interface according to the color block area.
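For illustration only, one possible way to locate such a color block area is to treat the covered part of the lens as a near-black region of the camera frame and take its bounding box. The brightness threshold and the use of NumPy below are assumptions, not requirements of the present disclosure.

```python
# Hypothetical sketch: find the color block area produced by the occluded part
# of the lens as the bounding box of the dark pixels in the camera frame.
import numpy as np


def find_color_block_area(frame: np.ndarray, dark_threshold: int = 30):
    """Return the bounding box (x, y, w, h) of the dark region, or None.

    frame: H x W x 3 uint8 image captured through the lens.
    """
    brightness = frame.mean(axis=2)          # per-pixel average of the channels
    ys, xs = np.nonzero(brightness < dark_threshold)
    if xs.size == 0:
        return None                          # lens not occluded
    x, y = int(xs.min()), int(ys.min())
    return x, y, int(xs.max()) - x + 1, int(ys.max()) - y + 1


if __name__ == "__main__":
    frame = np.full((480, 640, 3), 200, dtype=np.uint8)  # bright scene
    frame[:, :320] = 5                                    # left half covered
    print(find_color_block_area(frame))                   # ~ (0, 0, 320, 480)
```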
In an exemplary embodiment of the present disclosure, based on the foregoing embodiment, the apparatus 700 may further include a color block color conversion module specifically configured to: adjust the color of the color block area before the target display area is determined in the graphical user interface according to the color block area, so that the color block area is visually distinguishable on the graphical user interface.
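For illustration only, the color adjustment could be as simple as recoloring the dark, occluded pixels to a highlight color so that the color block area stands out on the graphical user interface; the threshold and the green highlight below are assumptions.

```python
# Hypothetical sketch: recolor the occluded (dark) pixels so the color block
# area is visually distinguishable; threshold and color are assumptions.
import numpy as np


def highlight_color_block(frame: np.ndarray, dark_threshold: int = 30,
                          highlight=(0, 255, 0)) -> np.ndarray:
    """Return a copy of the frame with the occluded pixels recolored."""
    out = frame.copy()
    out[frame.mean(axis=2) < dark_threshold] = highlight
    return out
```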
In an exemplary embodiment of the present disclosure, based on the foregoing embodiments, the apparatus further includes a target display area resizing module, which may be configured to: respond to an adjustment operation on the shielded area of the lens of the camera module, and adjust the display size of the target display area in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing embodiment, the apparatus further includes a target game action performing module, which may be configured to: control a target game role in the game to execute a target game action in response to the display size of the target display area in the graphical user interface meeting a first preset condition.
In an exemplary embodiment of the present disclosure, based on the foregoing embodiment, the game includes a decryption-type game; the apparatus further comprises a decryption condition response module, which may be configured to: display, in response to the display size of the target display area in the graphical user interface meeting a second preset condition, a decryption clue and/or a decryption element corresponding to the second preset condition in the graphical user interface.
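For illustration only, the two size-based responses described above (a first preset condition triggering a target game action and a second preset condition displaying a decryption clue and/or a decryption element) could be handled by a small threshold responder such as the sketch below; the threshold values and callback names are assumptions.

```python
# Hypothetical sketch: react to the display size of the target display area
# crossing two preset thresholds; values and callbacks are assumptions.
from dataclasses import dataclass
from typing import Callable


@dataclass
class PresetConditionResponder:
    first_threshold: float                    # fraction of the GUI area
    second_threshold: float                   # fraction of the GUI area
    perform_target_action: Callable[[], None]
    show_decryption_clue: Callable[[], None]

    def on_target_area_resized(self, area_px: int, gui_area_px: int) -> None:
        fraction = area_px / gui_area_px
        if fraction >= self.first_threshold:
            self.perform_target_action()      # target game role acts
        if fraction >= self.second_threshold:
            self.show_decryption_clue()       # decryption clue/element shown


if __name__ == "__main__":
    responder = PresetConditionResponder(
        first_threshold=0.3,
        second_threshold=0.6,
        perform_target_action=lambda: print("target game action performed"),
        show_decryption_clue=lambda: print("decryption clue displayed"),
    )
    responder.on_target_area_resized(area_px=700_000, gui_area_px=1_000_000)
```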
The specific details of each module in the information processing apparatus in the game have been described in detail in the corresponding information processing method in the game, and therefore are not described herein again.
In response to a shielding operation on the lens of the camera module of the terminal device, the in-game information processing apparatus provided by the present disclosure can switch a partial picture of the first picture displayed in the graphical user interface to the target picture, thereby enriching the display of game pictures and enhancing the game interaction experience of the user.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium capable of implementing the above method, on which a program product capable of implementing the above-described method of this specification is stored. In some possible embodiments, various aspects of the present disclosure may also be implemented in the form of a program product including program code which, when the program product is run on a terminal device, causes the terminal device to perform the steps according to various exemplary embodiments of the present disclosure described in the "exemplary methods" section above of this specification, for example:
displaying a first picture in a graphical user interface of terminal equipment, wherein the terminal equipment comprises a camera module; responding to shielding operation of a lens of the camera module, and determining a target display area in the graphical user interface according to the shielding condition of the lens; and switching the content displayed in the target display area from a local picture in the first picture to a target picture.
Optionally, the responding to the shielding operation of the lens of the camera module, and determining a target display area in the graphical user interface according to the shielding condition of the lens includes: and responding to partial shielding operation of the lens of the camera module, and determining the target display area in the graphical user interface according to the shielding condition of the lens.
Optionally, the determining the target display area in the graphical user interface according to the shielding condition of the lens includes: determining the target display area according to the corresponding display area of the shielded part of the lens in the graphical user interface; or determining the target display area according to the corresponding display area of the part of the lens which is not shielded in the user interface.
Optionally, the first picture is a real scene picture generated by a lens of the camera module, and the target picture is a virtual scene picture corresponding to the game; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a real scene picture generated through a lens of the camera module; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a scene thumbnail corresponding to the virtual scene of the game.
Optionally, the responding to the shielding operation of the lens of the camera module of the terminal device and determining the target display area in the graphical user interface according to the shielding condition of the lens includes: responding to the shielding operation of a lens of the camera module, and determining a color block area formed by the shielded part of the lens in the first picture; and determining the target display area in the graphical user interface according to the color block area.
Optionally, before determining the target display area in the graphical user interface according to the color block area, the method further includes: adjusting the color of the color block area so that the color block area is visually distinguishable on the graphical user interface.
Optionally, the method further includes: and responding to the adjustment operation of the shielding area of the lens of the camera module, and adjusting the display size of the target display area in the graphical user interface.
Optionally, the method further includes: and controlling a target game role in the game to execute a target game action in response to the display size of the target display area in the graphical user interface meeting a first preset condition.
Optionally, the game comprises a decryption-type game; the method further comprises the following steps: and responding to the display size of the target display area in the graphical user interface meeting a second preset condition, and displaying a decryption clue and/or a decryption element corresponding to the second preset condition in the graphical user interface.
Through the above embodiments, on one hand, based on the shielding operation on the lens of the camera module of the terminal device, a partial picture in the first picture displayed in the graphical user interface can be switched to the target picture, which improves the richness of game picture display; for example, a virtual game scene picture and a real scene picture can be displayed at the same time. On the other hand, the game picture displayed in the graphical user interface can be changed by the operation of shielding the lens of the camera module, which enhances the game interaction experience of the user.
Referring to fig. 8, a program product 800 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, any of which may generally be referred to herein as a "circuit," a "module," or a "system."
An electronic device 900 according to this embodiment of the disclosure is described below with reference to fig. 9. The electronic device 900 shown in fig. 9 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 9, the electronic device 900 is embodied in the form of a general purpose computing device. Components of electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one storage unit 920, a bus 930 connecting different system components (including the storage unit 920 and the processing unit 910), and a display unit 940.
Wherein the storage unit stores program code that is executable by the processing unit 910 to cause the processing unit 910 to perform steps according to various exemplary embodiments of the present disclosure described in the above section "exemplary method" of the present specification.
As another example, the processing unit 910 may also perform various steps of the method shown in fig. 5.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access storage unit (RAM)9201 and/or a cache storage unit 9202, and may further include a read only storage unit (ROM) 9203.
The electronic device 900 may also communicate with one or more external devices 1000 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure, for example:
displaying a first picture in a graphical user interface of terminal equipment, wherein the terminal equipment comprises a camera module; responding to shielding operation of a lens of the camera module, and determining a target display area in the graphical user interface according to the shielding condition of the lens; and switching the content displayed in the target display area from a local picture in the first picture to a target picture.
Optionally, the responding to the shielding operation of the lens of the camera module, and determining a target display area in the graphical user interface according to the shielding condition of the lens includes: and responding to partial shielding operation of the lens of the camera module, and determining the target display area in the graphical user interface according to the shielding condition of the lens.
Optionally, the determining the target display area in the graphical user interface according to the shielding condition of the lens includes: determining the target display area according to the corresponding display area of the shielded part of the lens in the graphical user interface; or determining the target display area according to the corresponding display area of the part of the lens which is not shielded in the user interface.
Optionally, the first picture is a real scene picture generated by a lens of the camera module, and the target picture is a virtual scene picture corresponding to the game; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a real scene picture generated through a lens of the camera module; or the first picture is a virtual scene picture corresponding to the game, and the target picture is a scene thumbnail corresponding to the virtual scene of the game.
Optionally, the responding to the shielding operation of the lens of the camera module, and determining a target display area in the graphical user interface according to the shielding condition of the lens includes: responding to shielding operation of a lens of the camera module, and determining a color block area formed by the shielded part of the lens in the first picture; and determining the target display area in the graphical user interface according to the color block area.
Optionally, before determining the target display area in the graphical user interface according to the color block area, the method further includes: adjusting the color of the color block area so that the color block area is visually distinguishable on the graphical user interface.
Optionally, the method further includes: and responding to the adjustment operation of the shielding area of the lens of the camera module, and adjusting the display size of the target display area in the graphical user interface.
Optionally, the method further includes: and responding to the condition that the display size of the target display area in the graphical user interface meets a first preset condition, and controlling a target game role in the game to execute a target game action.
Optionally, the game comprises a decryption-type game; the method further comprises the following steps: and responding to the display size of the target display area in the graphical user interface meeting a second preset condition, and displaying a decryption clue and/or a decryption element corresponding to the second preset condition in the graphical user interface.
Through the above embodiments, on one hand, in response to the shielding operation on the lens of the camera module of the terminal device, a partial picture in the first picture displayed in the graphical user interface can be switched to the target picture, which improves the richness of game picture display; for example, a virtual game scene picture and a real scene picture can be displayed at the same time. On the other hand, the game picture displayed in the graphical user interface can be changed by the operation of shielding the lens of the camera module, which enhances the game interaction experience of the user.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
Claims (12)
1. An information processing method in a game is applied to terminal equipment, and is characterized in that the terminal equipment comprises a camera module, and the method comprises the following steps:
displaying a first picture in a graphical user interface of the terminal equipment;
responding to shielding operation of a lens of the camera module, and determining a target display area in the graphical user interface according to the shielding condition of the lens;
and switching the content displayed in the target display area from a local picture in the first picture to a target picture.
2. The in-game information processing method according to claim 1, wherein the responding to the shielding operation of the lens of the camera module and determining a target display area in the graphical user interface according to the shielding condition of the lens comprises:
and responding to partial shielding operation of the lens of the camera module, and determining the target display area in the graphical user interface according to the shielding condition of the lens.
3. The in-game information processing method according to claim 2, wherein the determining the target display area in the graphical user interface according to the shielding condition of the lens comprises:
determining the target display area according to the corresponding display area of the shielded part of the lens in the graphical user interface; or
And determining the target display area according to the corresponding display area of the part of the lens which is not shielded in the user interface.
4. The in-game information processing method according to any one of claims 1 to 3, wherein the first screen is a real scene screen generated by a lens of the camera module, and the target screen is a virtual scene screen corresponding to the game; or
The first picture is a virtual scene picture corresponding to the game, and the target picture is a real scene picture generated through a lens of the camera module; or
The first picture is a virtual scene picture corresponding to the game, and the target picture is a scene thumbnail corresponding to the virtual scene of the game.
5. The in-game information processing method according to claim 1, wherein the responding to the shielding operation of the lens of the camera module and determining a target display area in the graphical user interface according to the shielding condition of the lens comprises:
responding to shielding operation of a lens of the camera module, and determining a color block area formed by the shielded part of the lens in the first picture;
and determining the target display area in the graphical user interface according to the color block area.
6. The in-game information processing method according to claim 5, wherein before the determining the target display area in the graphical user interface according to the color block area, the method further comprises:
adjusting the color of the color block area so that the color block area is visually distinguishable on the graphical user interface.
7. The in-game information processing method according to claim 1, characterized by further comprising:
and responding to the adjustment operation of the shielding area of the lens of the camera module, and adjusting the display size of the target display area in the graphical user interface.
8. The in-game information processing method according to claim 7, characterized by further comprising:
and controlling a target game role in the game to execute a target game action in response to the display size of the target display area in the graphical user interface meeting a first preset condition.
9. The in-game information processing method according to claim 7, wherein the game includes a decryption-type game; the method further comprises the following steps:
and responding to the display size of the target display area in the graphical user interface meeting a second preset condition, and displaying a decryption clue and/or a decryption element corresponding to the second preset condition in the graphical user interface.
10. An in-game information processing apparatus, applied to a terminal device, wherein the terminal device comprises a camera module, and the apparatus comprises:
a game screen display module configured to display a first screen in a graphical user interface of the terminal device;
the target display area determining module is configured to respond to shielding operation of a lens of the camera module and determine a target display area in the graphical user interface according to shielding conditions of the lens;
a partial game screen switching module configured to switch the content displayed in the target display area from a partial screen in the first screen to a target screen.
11. A computer-readable medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements an in-game information processing method according to any one of claims 1 to 8.
12. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the in-game information processing method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210389237.8A CN114733197A (en) | 2022-04-13 | 2022-04-13 | Information processing method and device in game, readable storage medium and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210389237.8A CN114733197A (en) | 2022-04-13 | 2022-04-13 | Information processing method and device in game, readable storage medium and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114733197A true CN114733197A (en) | 2022-07-12 |
Family
ID=82281615
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210389237.8A Pending CN114733197A (en) | 2022-04-13 | 2022-04-13 | Information processing method and device in game, readable storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114733197A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination |