CN114082188A - Image fusion processing method, system, equipment and computer readable storage medium - Google Patents


Info

Publication number
CN114082188A
CN114082188A
Authority
CN
China
Prior art keywords
instruction
forming
state
fusion processing
trigger instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111419666.7A
Other languages
Chinese (zh)
Inventor
李阳
王晓峰
沈鑫鑫
周西伟
邓宗亮
张宇
刘景�
丁玉衡
龚圣轲
林腾达
刘锐
刘玉
刘子豪
王静
魏涛
徐梦娇
张鑫
赵新宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd filed Critical Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202111419666.7A priority Critical patent/CN114082188A/en
Publication of CN114082188A publication Critical patent/CN114082188A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525: Changing parameters of virtual cameras
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/66: Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6653: Methods for processing data by generating or executing the game program for rendering three dimensional images for altering the visibility of an object, e.g. preventing the occlusion of an object, partially hiding an object

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to the field of video stream image processing technologies, and in particular, to an image fusion processing method, system, device, and computer-readable storage medium. The image fusion processing method comprises the following steps: in a state where a target body is associated with a display control unit, forming a first control instruction when a trigger instruction matches a preset instruction; and the display control unit adjusting the display state of the target body under the action of the first control instruction.

Description

Image fusion processing method, system, equipment and computer readable storage medium
Technical Field
The present invention relates to the field of video stream image processing technologies, and in particular, to an image fusion processing method, system, device, and computer-readable storage medium.
Background
In a third-person perspective game, the player character is visible on the game screen and the genre emphasizes a sense of action; its wider field of view and more spectacular motion effects make it popular with players. In such games, different special effects can be loaded onto game characters, which must also perform different actions to complete game tasks. The player observes and controls the character through a top-down or other camera angle, and when the character is close to the camera viewpoint, occlusion by the special effects prevents accurate control. Illustratively, when a character in front of the viewpoint currently carries a special effect and the player directs it to shoot, swing a blade, or perform similar actions, the effect loaded around the character hides the current position of the character's hands, so the action cannot be controlled accurately and the player's game experience suffers.
Disclosure of Invention
In one aspect, the present application provides an image fusion processing method, including:
forming a first control instruction, in a state where the target body is associated with the display control unit, when the trigger instruction matches a preset instruction;
and the display control unit adjusting the display state of the target body under the action of the first control instruction.
As a further preferred embodiment of the image fusion processing method above, in a state where the target body is associated with the display control unit, forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
reading a first parameter between a first characteristic body and a second characteristic body, and forming the trigger instruction according to the first parameter;
and forming the first control instruction under the state that the trigger instruction is matched with a first threshold value.
As a further preferred embodiment of the image fusion processing method above, in a state where the target body is associated with the display control unit, forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
reading a second parameter between a current target body and the first characteristic, and forming the trigger instruction according to the second parameter;
and forming the first control instruction under the condition that the trigger instruction is matched with a second threshold value.
As a further preferred embodiment of the image fusion processing method above, in a state where the target body is associated with the display control unit, forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
reading the current dynamic characteristic parameters, and forming the trigger instruction according to the dynamic characteristic parameters;
and forming the first control instruction under the condition that the trigger instruction is matched with a third threshold value.
On the other hand, the present application further provides an image fusion processing system, which includes:
the first control instruction forming unit is used for forming a first control instruction in a state that the trigger instruction is matched with the prefabricated instruction in a state that the target body is associated with the display control unit;
and the display control unit adjusts the display state of the target body under the action of the first control instruction.
Preferably, in the image fusion processing system, the first control instruction forming unit specifically includes:
the first trigger instruction forming device reads a first parameter between a first characteristic body and a second characteristic body and forms the trigger instruction according to the first parameter;
and the first judgment device forms the first control instruction under the condition that the trigger instruction is matched with a first threshold value.
Preferably, in the image fusion processing system, the first control instruction forming unit specifically includes:
the second trigger instruction forming device is used for reading a second parameter between the current target body and the first characteristic and forming the trigger instruction according to the second parameter;
and the second judgment device forms the first control instruction under the condition that the trigger instruction is matched with a second threshold value.
Preferably, in the image fusion processing system, the first control instruction forming unit specifically includes:
the third trigger instruction forming device reads the current dynamic characteristic parameter and forms the trigger instruction according to the dynamic characteristic parameter;
and the third judgment device forms the first control instruction under the condition that the trigger instruction is matched with a third threshold value.
In another aspect, the present application further provides an image fusion processing apparatus, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image fusion processing method according to any one of the above aspects.
Finally, the present application further provides a computer-readable storage medium storing an image fusion processing program which, when executed by a processor, implements the steps of the image fusion processing method according to any one of the above items.
Compared with the prior art, the beneficial effects of this application are:
and forming a first control instruction in a state that the trigger instruction is matched with the preset instruction, for example, the vision of a player is influenced by the current special effect, judging that the trigger instruction is matched with the preset instruction to form the first control instruction, adjusting the display mode of the current special effect under the action of the first control instruction, improving the transparency of the special effect, and enabling the player to clearly observe and control the character in a state that the special effect is displayed in the display mode with the adjusted high transparency.
Drawings
Fig. 1 is a schematic flowchart of an image fusion processing method according to an embodiment of the present invention;
fig. 2 is a comparison diagram of the effects of an image fusion processing method according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of an image fusion processing method according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of an image fusion processing method according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of an image fusion processing method according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an apparatus according to an embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it. In the following description, suffixes such as "module", "component", or "unit" are used only to facilitate the explanation of the present invention and carry no specific meaning in themselves; "module", "component", and "unit" may therefore be used interchangeably.
Example one
In one aspect, the application provides an image fusion processing method applied to a 3D rendering engine. A special effect (for example, a halo or a weapon-use effect) is formed during 3D rendering through a Prefab data packet, and the image fusion processing method of the present disclosure is recorded in that Prefab data packet. Specifically:
as shown in fig. 1, in step S110, in a state of display control associated with a target body and/or a target action, a first control instruction is formed in a state that the trigger instruction matches a preset instruction; the target body may be a special effect, for example, a player adds a protection shield to the current character, the protection shield can be displayed in a special effect manner, and the special effect manner display area can be regarded as the protection range of the protection shield. The target motion may be shooting, drawing bow, waving knife, etc., and the target motion may be set by the user, which is not limited specifically herein.
A binding relationship is established between the target body and/or target action and the display control, so that the current display mode is changed when the target body and/or target action affects the player's view.
The trigger instruction can be formed from external input, or from the state of the current game character in the game scene.
In step S120, the display state of the target body is adjusted under the action of the first control instruction. The first control instruction, formed when the trigger instruction matches the preset instruction, is intended to change the display mode of the current special effect.
In a specific embodiment, taking a special effect as an example, the effect is added to the game by loading a Prefab (prefabricated part), and the association between the object and the display control is established inside the effect prefab. Illustratively, when a preset condition is met, the prefab controls the display mode of the effect (for example, raising its transparency or adjusting its color); this may be implemented by adjusting the material transparency of the effect, as shown in fig. 2.
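Steps S110 and S120 can be sketched as follows. The dict-based instruction format, the field names, and the 0.3 alpha value are illustrative assumptions, not the patent's actual data structures:

```python
def image_fusion_step(trigger_instruction, preset_instruction, effect):
    """S110: form a first control instruction when the trigger instruction
    matches the preset instruction. S120: adjust the display state of the
    target body (the effect) under that instruction."""
    if trigger_instruction == preset_instruction:  # S110: match check
        first_control_instruction = {"op": "set_alpha", "value": 0.3}
        # S120: raise transparency so the player can see through the effect
        effect["material_alpha"] = first_control_instruction["value"]
        return first_control_instruction
    return None  # no match: display state is left unchanged
```

In an engine such as Unity, the equivalent adjustment would be made on the effect prefab's material; the sketch above only captures the control flow.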
A first control instruction is formed when the trigger instruction matches the preset instruction. For example, when the current special effect obstructs the player's view, the trigger instruction is judged to match the preset instruction and the first control instruction is formed; under this instruction the display mode of the current special effect is adjusted by raising its transparency, so that with the effect shown at the adjusted, higher transparency the player can clearly observe and control the character.
As shown in fig. 3, in a further preferred embodiment, with the target body associated with the display control unit, step S110 of forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
step S11011, reading a first parameter between a first characteristic body and a second characteristic body, and forming the trigger instruction according to the first parameter; the object is a special effect, the first feature can be defined as a view angle lens, the second feature can be positioned as a main body of the game role, such as a game character, and the first parameter can be a relative included angle between a connecting line formed by the first feature and the second feature and the visual front view direction. Illustratively, when the first parameter is 0 °, it can be understood that the second feature is located right in front of the user, and when the first parameter is 60 °, the included angle between the bent angle viewing angle and the second feature is 60 °.
Step S11012, forming the first control instruction in a state where the trigger instruction matches a first threshold.
The first threshold may be 50° to 70°. When the trigger instruction falls within 50° to 70°, part of the player's view is judged to be blocked and a first control instruction is formed; this control instruction is intended to set the special effect to a semi-transparent state.
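A minimal sketch of this angle-based trigger, assuming simple 3D vectors for camera position, camera forward direction, and character position (the function and return-value names are illustrative, not from the patent):

```python
import math

def first_parameter(camera_pos, camera_forward, character_pos):
    """First parameter: included angle (degrees) between the line from the
    first feature body (camera) to the second feature body (character)
    and the forward view direction."""
    to_char = [c - p for c, p in zip(character_pos, camera_pos)]
    dot = sum(f * t for f, t in zip(camera_forward, to_char))
    norm = math.hypot(*camera_forward) * math.hypot(*to_char)
    cos_angle = max(-1.0, min(1.0, dot / norm))  # clamp for acos safety
    return math.degrees(math.acos(cos_angle))

def first_trigger(angle_deg, lo=50.0, hi=70.0):
    """Form the first control instruction when the angle lies within the
    first threshold band (50 to 70 degrees)."""
    return "set_effect_translucent" if lo <= angle_deg <= hi else None
```

For a character offset 2 units sideways and 1 unit forward of the camera, the angle is about 63.4°, which falls inside the band and triggers the instruction.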
As shown in fig. 4, in a further preferred embodiment, with the target body associated with the display control unit, step S110 of forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
step S11021, reading a second parameter between a current target body and the first characteristic, and forming the trigger instruction according to the second parameter; the target may be a special effect. The first feature can be defined as a view lens, and the second parameter is a distance between the object and the first feature.
Step S11022, forming the first control instruction in a state where the trigger instruction matches a second threshold.
When the distance between the target body and the first feature body is smaller than the second threshold, part of the player's view is judged to be blocked and a first control instruction is formed; this control instruction is intended to set the special effect to a semi-transparent state.
It should be noted that the first parameter and the second parameter may be formed simultaneously or separately, and formation of the first control instruction may be triggered when either the first parameter matches the first threshold or the second parameter matches the second threshold.
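The distance-based trigger of steps S11021 and S11022, and the either-or combination of the two triggers, might be sketched as follows. The 2.0-unit threshold value and all names are assumptions for illustration:

```python
import math

def second_trigger(target_pos, camera_pos, second_threshold=2.0):
    """Second parameter: distance between the target body (effect) and the
    first feature body (camera). Below the threshold, the effect is judged
    to occlude the player's view."""
    distance = math.dist(target_pos, camera_pos)
    return "set_effect_translucent" if distance < second_threshold else None

def any_trigger(first_result, second_result):
    """Either parameter matching its threshold forms the control instruction."""
    return first_result or second_result
```

The effect returns to its normal display once neither trigger condition holds, since both functions then yield `None`.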
As shown in fig. 5, in a further preferred embodiment, with the target body associated with the display control unit, step S110 of forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
step S11031, reading the current dynamic characteristic parameter, and forming the trigger instruction according to the dynamic characteristic parameter; the dynamic characteristic parameter may include an action parameter or an event parameter formed by an externally input control instruction.
The action parameters include at least shooting-type actions. The event parameter can be formed from a control instruction output by an external controller, which at least includes a touch screen, a keyboard, and a gamepad. Event parameters can be defined by the player or by the system.
Step S11032, forming the first control instruction in a state where the trigger instruction matches a third threshold.
The third threshold can be a preset action defined by the user; when the trigger instruction matches the preset action, the special effect is controlled to work in a semi-transparent display state.
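A sketch of this action/event trigger, with a hypothetical preset action set standing in for the "third threshold" (the key names and action strings are illustrative assumptions):

```python
# Hypothetical preset actions whose special effects should turn translucent.
PRESET_ACTIONS = {"shoot", "draw_bow", "swing_blade"}

def third_trigger(dynamic_params, preset_actions=PRESET_ACTIONS):
    """Dynamic feature parameters may carry an action parameter and/or an
    event parameter produced by an external controller (touch screen,
    keyboard, gamepad)."""
    if dynamic_params.get("action") in preset_actions:
        return "set_effect_translucent"
    if dynamic_params.get("event") in preset_actions:
        return "set_effect_translucent"
    return None  # outside the third threshold: no change to the display
```

Because event parameters may be player- or system-defined, `preset_actions` is passed as a parameter so the set can be replaced at runtime.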
It should be noted that when the trigger instruction does not match the first, second, or third threshold, the current state of the game character is judged not to affect the user's view, and no change is made.
It should be noted that the special effect includes not only the effect body but also the particle objects that accompany it. When the display state of the effect needs to be adjusted, the display states of the effect body and of the particle objects are adjusted at the same time, so that the influence of distance on each part of the effect is finely controlled and the result is more controllable.
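The simultaneous adjustment of the effect body and its particle objects might look like the following sketch; the field names and default alpha value are assumptions, not the patent's data model:

```python
def adjust_display_state(effect, alpha=0.3):
    """Apply the adjusted transparency to the effect body and to every
    particle object attached to it, so that all parts of the effect
    change together."""
    effect["body_alpha"] = alpha
    for particle in effect.get("particles", []):
        particle["alpha"] = alpha
    return effect
```

Restoring the effect is the same call with `alpha=1.0`, so the binding established in the prefab can toggle the whole effect between the two display modes.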
Example two
On the other hand, the present application further provides an image fusion processing system, which includes:
the first control instruction forming unit is used for forming a first control instruction in a state that the trigger instruction is matched with the prefabricated instruction in a state that the target body is associated with the display control unit;
and the display control unit adjusts the display state of the target body under the action of the first control instruction.
As a further preferred embodiment, in the image fusion processing system, the first control instruction forming unit specifically includes:
the first trigger instruction forming device reads a first parameter between a first characteristic body and a second characteristic body and forms the trigger instruction according to the first parameter;
and the first judgment device forms the first control instruction under the condition that the trigger instruction is matched with a first threshold value.
As a further preferred embodiment, in the image fusion processing system, the first control instruction forming unit specifically includes:
the second trigger instruction forming device is used for reading a second parameter between the current target body and the first characteristic and forming the trigger instruction according to the second parameter;
and the second judgment device forms the first control instruction under the condition that the trigger instruction is matched with a second threshold value.
As a further preferred embodiment, in the image fusion processing system, the first control instruction forming unit specifically includes:
the third trigger instruction forming device reads the current dynamic characteristic parameter and forms the trigger instruction according to the dynamic characteristic parameter;
and the third judgment device forms the first control instruction under the condition that the trigger instruction is matched with a third threshold value.
The working principle of an image fusion processing system is the same as that of the image fusion processing method, and is not described herein again.
EXAMPLE III
In another aspect, the present application further provides an image fusion processing device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor; when executed by the processor, the computer program implements the steps of the image fusion processing method according to any one of the above aspects, including:
forming a first control instruction, in a state where the target body is associated with the display control unit, when the trigger instruction matches a preset instruction;
and the display control unit adjusting the display state of the target body under the action of the first control instruction.
As shown in fig. 6, the device specifically includes one or more processors 610 (one processor 610 is taken as an example in fig. 6) and a memory 620 for storing one or more programs; when the one or more programs are executed by the one or more processors 610, the one or more processors 610 implement the image fusion processing method according to any embodiment of the present invention. The processor 610 and the memory 620 may be connected by a bus or other means; a bus connection is shown in fig. 6.
The memory 620, as a computer-readable storage medium, may be used to store software programs, computer-executable programs, and modules, such as the program instructions corresponding to the image fusion processing method in the embodiments of the present invention. The processor 610 executes the various functional applications and data processing of the device by running the software programs, instructions, and modules stored in the memory 620, thereby implementing the image fusion processing method described above.
The memory 620 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the device, and the like. Further, the memory 620 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 620 can further include memory located remotely from the processor 610, which can be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Example four
Finally, the present application further provides a computer-readable storage medium in which an image fusion processing program is stored; when executed by a processor, the program implements the steps of the image fusion processing method, including:
forming a first control instruction, in a state where the target body is associated with the display control unit, when the trigger instruction matches a preset instruction;
and the display control unit adjusting the display state of the target body under the action of the first control instruction.
Of course, the computer-executable instructions of the computer-readable storage medium provided by the embodiments of the present invention are not limited to the method operations described above, and may also perform related operations in the image fusion processing method provided by any embodiment of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the above apparatus embodiments, the included units and modules are divided merely according to functional logic, but the division is not limited thereto as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An image fusion processing method is characterized by comprising the following steps:
forming a first control instruction in a state that the trigger instruction is matched with a prefabricated instruction in a state that the target body is associated with the display control unit;
and the display control unit adjusts the display state of the target body under the action of the first control instruction.
2. The image fusion processing method according to claim 1, wherein, in a state where the target body is associated with the display control unit, forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
reading a first parameter between a first characteristic body and a second characteristic body, and forming the trigger instruction according to the first parameter;
and forming the first control instruction under the state that the trigger instruction is matched with a first threshold value.
3. The image fusion processing method according to claim 1, wherein, in a state where the target body is associated with the display control unit, forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
reading a second parameter between a current target body and the first characteristic, and forming the trigger instruction according to the second parameter;
and forming the first control instruction under the condition that the trigger instruction is matched with a second threshold value.
4. The image fusion processing method according to claim 1, wherein, in a state where the target body is associated with the display control unit, forming the first control instruction when the trigger instruction matches the preset instruction specifically includes:
reading the current dynamic characteristic parameters, and forming the trigger instruction according to the dynamic characteristic parameters;
and forming the first control instruction under the condition that the trigger instruction is matched with a third threshold value.
5. An image fusion processing system, comprising:
the first control instruction forming unit is used for forming a first control instruction in a state that the trigger instruction is matched with the prefabricated instruction in a state that the target body is associated with the display control unit;
and the display control unit adjusts the display state of the target body under the action of the first control instruction.
6. The image fusion processing system according to claim 5, wherein the first control instruction forming unit specifically includes:
the first trigger instruction forming device reads a first parameter between a first characteristic body and a second characteristic body and forms the trigger instruction according to the first parameter;
and the first judgment device forms the first control instruction under the condition that the trigger instruction is matched with a first threshold value.
7. The image fusion processing system according to claim 5, wherein the first control instruction forming unit specifically includes:
the second trigger instruction forming device is used for reading a second parameter between the current target body and the first characteristic body and forming the trigger instruction according to the second parameter;
and the second judgment device is used for forming the first control instruction under the condition that the trigger instruction is matched with a second threshold value.
8. The image fusion processing system according to claim 5, wherein the first control instruction forming unit specifically includes:
the third trigger instruction forming device is used for reading the current dynamic characteristic parameter and forming the trigger instruction according to the dynamic characteristic parameter;
and the third judgment device is used for forming the first control instruction under the condition that the trigger instruction is matched with a third threshold value.
9. An image fusion processing device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image fusion processing method according to any one of claims 1 to 4.
10. A computer readable storage medium having an image fusion processing program stored thereon, wherein the image fusion processing program, when executed by a processor, implements the steps of the image fusion processing method according to any one of claims 1 to 4.
CN202111419666.7A 2021-11-26 2021-11-26 Image fusion processing method, system, equipment and computer readable storage medium Pending CN114082188A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111419666.7A CN114082188A (en) 2021-11-26 2021-11-26 Image fusion processing method, system, equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111419666.7A CN114082188A (en) 2021-11-26 2021-11-26 Image fusion processing method, system, equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114082188A true CN114082188A (en) 2022-02-25

Family

ID=80304836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111419666.7A Pending CN114082188A (en) 2021-11-26 2021-11-26 Image fusion processing method, system, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114082188A (en)

Similar Documents

Publication Publication Date Title
US11977713B2 (en) Viewing angle adjustment method and device, electronic device, and computer-readable storage medium
US10702776B2 (en) Information processing method, storage medium and electronic device
WO2018177170A1 (en) Display control method and apparatus for game picture, storage medium and electronic device
US10013812B2 (en) Method and system for controlling a head-mounted display system
US11883741B2 (en) Information processing method and apparatus, mobile terminal, and storage medium
JP7137804B2 (en) Method and system for gesture-based interaction
WO2018018857A1 (en) Gesture control method and apparatus applied to vr device
US11166002B2 (en) Method and device for live broadcasting virtual avatar
US20180025531A1 (en) Method of providing virtual space, method of providing virtual experience, program and recording medium therefor
US8998718B2 (en) Image generation system, image generation method, and information storage medium
US10456684B2 (en) Attention control for information processing apparatus, information processing system, information processing method, non-transitory computer-readable storage medium storing information processing program
US8212813B2 (en) Image generating apparatus, method of generating image, program, and recording medium
US6231440B1 (en) Computer game apparatus
CN108211342A (en) Visual angle regulating method and device, storage medium and electronic device
CN106873886B (en) Control method and device for stereoscopic display and electronic equipment
CN108553891A (en) Object method of sight and device, storage medium and electronic device
US11117052B2 (en) Game device, control method of game device, and storage medium that can be read by computer
CN108854063A (en) Method of sight, device, electronic equipment and storage medium in shooting game
CN111784844B (en) Method and device for observing virtual object, storage medium and electronic equipment
US20140354631A1 (en) Non-transitory storage medium encoded with computer readable information processing program, information processing apparatus, information processing system, and information processing method
CN114159787A (en) Control method and device of virtual object, electronic equipment and readable medium
WO2020114154A1 (en) Animation control method and device, storage medium, and electronic device
CN109107153A (en) The method and device of game skill release, electronic equipment, storage medium
WO2021143290A1 (en) Method and apparatus for displaying virtual prop, storage medium and electronic device
CN114082188A (en) Image fusion processing method, system, equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination