CN112870694A - Virtual scene picture display method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112870694A
CN112870694A (application CN202110301140.2A)
Authority
CN
China
Prior art keywords
virtual
prop
item
picture
display area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110301140.2A
Other languages
Chinese (zh)
Other versions
CN112870694B (en)
Inventor
崔维健
仇蒙
田聪
何晶晶
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110301140.2A
Publication of CN112870694A
Application granted
Publication of CN112870694B
Legal status: Active

Classifications

    • A63F 13/42 — Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/537 — Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD], using indicators such as the condition of a game character on screen
    • A63F 13/55 — Controlling game characters or game objects based on the game progress
    • A63F 13/837 — Shooting of targets
    • A63F 2300/8076 — Features specially adapted for executing a shooting game
    • A63F 2300/8082 — Features specially adapted for executing a virtual reality game

Abstract

The application provides a picture display method and apparatus for a virtual scene, an electronic device, and a storage medium. The method includes: in response to a trigger instruction for a virtual prop in a virtual scene, controlling a virtual object to enter the interior of the virtual prop; in response to an operation instruction of the virtual object for the virtual prop, controlling the virtual prop to be in an operating state, and presenting the field-of-view picture of the virtual object inside the operating virtual prop; and presenting, in the field-of-view picture, a prop display area of the virtual prop, in which a picture of the virtual prop within the virtual scene is shown. The application brings the user better immersion and experience, lets the user check the operating state of the virtual prop and the surrounding virtual scene at any time without repeated interactive operations, improves human-computer interaction efficiency, and reduces the occupation of hardware processing resources.

Description

Virtual scene picture display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of virtualization and human-computer interaction technologies, and in particular, to a method and apparatus for displaying a picture of a virtual scene, an electronic device, and a storage medium.
Background
With the development of computer technology, electronic devices can present richer and more vivid virtual scenes. A virtual scene is a digital scene constructed by a computer using digital communication technologies; in it, a user can obtain a fully virtualized experience (for example, virtual reality) or a partially virtualized experience (for example, augmented reality) in vision, hearing, and other senses, and can interact with objects in the virtual scene, or control a virtual object to interact with other objects through virtual props and receive feedback.
In the related art, when a user controls a virtual object to enter and operate a virtual prop, the terminal presents only the field-of-view picture from inside the prop, so the user can see only part of the prop and part of the virtual scene within a limited field of view. When the user wants to check the virtual prop itself, its operating state, and the scene around it, this either requires multiple interactive operations or cannot be achieved at all, resulting in low human-computer interaction efficiency and a weak perception of the prop's appearance and sense of motion.
Disclosure of Invention
The embodiment of the application provides a picture display method and device for a virtual scene, electronic equipment and a storage medium, which can bring better immersion and experience for a user, facilitate the user to know the operation state of a virtual prop and the surrounding virtual scene at any time, avoid multiple interactive operations, improve the human-computer interaction efficiency and reduce the occupation of hardware processing resources.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a picture display method of a virtual scene, which comprises the following steps:
responding to a trigger instruction aiming at a virtual item in a virtual scene, and controlling a virtual object to enter the inside of the virtual item;
responding to an operation instruction of the virtual object for the virtual prop, controlling the virtual prop to be in an operation state, and presenting a visual field picture when the virtual object is in the virtual prop in the operation state;
in the visual field picture, presenting a prop display area of the virtual prop, and
and in the item display area, presenting a picture of the virtual item in the virtual scene.
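The claimed steps can be summarized as a small state flow. The sketch below is purely illustrative: all class, field, and function names (`VirtualProp`, `Screen`, `on_trigger`, `on_operate`) are invented for this example and are not part of the patent.

```python
# Hypothetical sketch of the claimed display flow; names are illustrative only.
from dataclasses import dataclass

@dataclass
class VirtualProp:
    operating: bool = False          # whether the prop is in the operating state

@dataclass
class Screen:
    view: str = "scene"              # what the main field-of-view picture shows
    prop_area_visible: bool = False  # whether the prop display area is shown
    prop_area_content: str = ""      # picture rendered inside the prop display area

def on_trigger(prop: VirtualProp, screen: Screen) -> None:
    """Step 1: trigger instruction -> the virtual object enters the prop interior."""
    screen.view = "inside_prop"

def on_operate(prop: VirtualProp, screen: Screen) -> None:
    """Steps 2-3: operation instruction -> operating state, interior view,
    plus a prop display area showing the prop within the scene."""
    prop.operating = True
    screen.view = "first_person_inside_prop"
    screen.prop_area_visible = True
    screen.prop_area_content = "third_person_prop_in_scene"
```

In this reading, the prop display area is simply extra state attached to the main picture, updated together with it when the operation instruction arrives.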
An embodiment of the present application further provides a device for displaying a virtual scene, including:
the control module is used for responding to a trigger instruction aiming at a virtual item in a virtual scene and controlling a virtual object to enter the interior of the virtual item;
the first presentation module is used for responding to an operation instruction of the virtual object for the virtual prop, controlling the virtual prop to be in an operation state, and presenting a view picture when the virtual object is in the virtual prop in the operation state;
and the second presentation module is used for presenting a prop display area of the virtual prop in the view picture and presenting a picture of the virtual prop in the virtual scene in the prop display area.
In the above scheme, the first presentation module is further configured to present, in the operating state, a field-of-view picture observed from a first-person perspective when the virtual object is inside the virtual prop;
correspondingly, the second presentation module is further configured to present, in the prop display area, a picture of the virtual prop in the virtual scene from a third-person perspective.
In the above solution, the view frame includes: a first sub-picture corresponding to the inside of the virtual prop and a second sub-picture corresponding to the outside of the virtual prop; the second presentation module is further configured to present a prop display area of the virtual prop in a first sub-picture corresponding to the inside of the virtual prop.
In the above scheme, the second presentation module is further configured to present, in the item display area, a picture of the virtual item in the virtual scene by using a display style corresponding to an item attribute of the virtual item;
wherein the prop attributes include at least one of: and the electric quantity attribute and the protection attribute of the virtual prop.
In the above scheme, the second presentation module is further configured to obtain a protection attribute value of the virtual item when the item attribute is a protection attribute;
and when the protection attribute value belongs to a target protection attribute value interval, presenting a picture of the virtual prop in the virtual scene using a blurred display style.
In the above scheme, the second presentation module is further configured to, when the property attribute is an electric quantity attribute, obtain an electric quantity attribute value of the virtual property;
and when the electric quantity attribute value belongs to the target electric quantity attribute value interval, adopting a preset transparency display style to present a picture of the virtual prop in the virtual scene.
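The two attribute-driven styles above can be sketched as a single selection function. The interval boundaries (0 to 30) and the style labels are assumptions made for this illustration; the patent only specifies that a target interval triggers the blurred or preset-transparency style.

```python
# Illustrative mapping from prop attribute values to display styles.
# The target intervals (0-30) are invented for this sketch.
def display_style(attribute: str, value: float) -> str:
    """Return the style used to draw the prop picture in the prop display area."""
    if attribute == "protection" and 0 <= value <= 30:
        return "blurred"           # low protection -> blurred display style
    if attribute == "electric_quantity" and 0 <= value <= 30:
        return "semi_transparent"  # low charge -> preset-transparency display style
    return "normal"
```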
In the above scheme, the second presenting module is further configured to present, in the view screen, a hidden control for the prop display area;
correspondingly, the second presentation module is further configured to receive a region hiding instruction for the prop display region, where the region hiding instruction is triggered based on the hiding control;
and hiding the displayed prop display area of the presented virtual prop in response to the area hiding instruction.
In the above scheme, the second presenting module is further configured to present, in the view screen, an area expansion control corresponding to the prop display area;
correspondingly, the second presentation module is further configured to receive a region expansion instruction for the prop display region, which is triggered based on the region expansion control;
and responding to the area expansion instruction, and presenting a hidden prop display area of the virtual prop.
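The hide and expand controls described above amount to toggling the visibility of the prop display area. A minimal sketch, with hypothetical method names standing in for the region-hide and region-expand instructions:

```python
# Sketch of the hide/expand controls for the prop display area.
class PropDisplayArea:
    def __init__(self) -> None:
        self.visible = True  # shown by default while the prop is operating

    def on_hide_control(self) -> None:
        """Region-hide instruction triggered from the hide control."""
        self.visible = False

    def on_expand_control(self) -> None:
        """Region-expand instruction triggered from the expand control."""
        self.visible = True
```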
In the above scheme, the apparatus further comprises:
and the switching module is used for responding to the triggering operation aiming at the prop display area, and switching the displayed visual field picture when the virtual object is positioned in the virtual prop into the picture when the virtual prop is positioned in the virtual scene.
In the foregoing solution, the switching module is further configured to present a sub-picture independent of the picture, and
and presenting a visual field picture corresponding to the outside of the virtual prop when the virtual object is positioned in the virtual prop in the sub-picture.
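The switching module's behavior can be read as a swap: the main picture becomes the third-person prop-in-scene picture, while the previous interior field-of-view picture moves into an independent sub-picture. The dictionary keys and picture labels below are assumptions for the sketch:

```python
# Hypothetical sketch of the switching module: tapping the prop display area
# swaps which picture occupies the main view.
def switch_on_tap(screen: dict) -> dict:
    """Main view becomes the prop-in-scene picture; the previous field-of-view
    picture is kept in an independent sub-picture."""
    return {"main": "prop_in_scene_third_person", "sub": screen["main"]}
```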
In the above scheme, the apparatus further comprises:
the third presentation module is used for presenting a display control corresponding to the prop display area;
responding to the trigger operation aiming at the display control, and controlling the state of the display control to be an opening state;
correspondingly, the second presentation module is further configured to present, in the view screen, a prop display area of the virtual prop when the state of the display control is the open state.
In the above scheme, the control module is further configured to obtain a prop attribute value corresponding to a prop attribute of the virtual prop; wherein the prop attributes include at least one of: the electric quantity attribute and the protection attribute of the virtual prop;
and when the prop attribute value is lower than an attribute value threshold, controlling the virtual object to exit the interior of the virtual prop and canceling the presented prop display area of the virtual prop.
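The automatic-exit rule can be sketched as a threshold check. The threshold constant and state keys below are invented for this example; the patent only specifies that falling below a threshold forces the exit and cancels the prop display area.

```python
# Illustrative check for the automatic-exit behaviour.
EXIT_THRESHOLD = 5.0  # hypothetical minimum attribute value

def maybe_force_exit(attr_value: float, state: dict) -> dict:
    """If electric quantity or protection drops below the threshold, the virtual
    object exits the prop and the prop display area is cancelled."""
    if attr_value < EXIT_THRESHOLD:
        state = dict(state, inside_prop=False, prop_area_visible=False)
    return state
```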
In the above scheme, the second presentation module is further configured to present, in the view screen, an exit control corresponding to the virtual item;
correspondingly, the control module is further configured to control the virtual object to exit from the inside of the virtual item and cancel the presented item display area of the virtual item in response to the triggering operation for the exit control.
In the above scheme, the control module is further configured to present a use control for entering the inside of the virtual item;
and responding to a triggering instruction aiming at the virtual prop in the virtual scene based on the triggering of the use control, and controlling the virtual object to enter the inside of the virtual prop.
In the above scheme, the second presentation module is further configured to present, in the item display area, attribute indication information corresponding to an item attribute of the virtual item;
wherein the prop attributes include at least one of: the electric quantity attribute and the protection attribute of the virtual prop; and the attribute indication information is used for indicating the available state corresponding to the property of the prop.
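The attribute indication information above maps attribute values to an availability state shown in the prop display area. The state labels and zero thresholds in this sketch are assumptions, not values from the patent:

```python
# Sketch of attribute indication information: deriving an availability state
# from the electric quantity and protection attributes. Labels are invented.
def availability(electric_quantity: float, protection: float) -> str:
    if electric_quantity <= 0:
        return "unavailable"  # prop cannot operate without charge
    if protection <= 0:
        return "vulnerable"   # prop operates but offers no protection
    return "available"
```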
An embodiment of the present application further provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the picture display method of the virtual scene provided by the embodiment of the application when the executable instructions stored in the memory are executed.
The embodiment of the present application further provides a computer-readable storage medium, which stores executable instructions, and when the executable instructions are executed by a processor, the method for displaying the virtual scene image provided by the embodiment of the present application is implemented.
The embodiment of the application has the following beneficial effects:
when the virtual object is controlled to enter the inside of the virtual prop and the virtual prop is controlled to be in an operation state, a visual field picture when the virtual object is in the inside of the virtual prop in the operation state is presented, and a prop display area of the virtual prop is presented in the visual field picture, so that the picture of the virtual prop in a virtual scene is presented through the prop display area.
Therefore, the visual field picture when the virtual object is in the virtual prop in the operating state is presented, and then the picture that the virtual prop is in the virtual scene is presented in the prop display area of the visual field picture, so that the better immersion and experience feeling can be brought to a user, the user can know the operating state of the virtual prop and the surrounding virtual scene at any time conveniently, multiple interactive operations are not needed, the human-computer interaction efficiency is improved, and the occupation of hardware processing resources is reduced.
Drawings
Fig. 1 is a schematic diagram of an architecture of a screen display system 100 for a virtual scene according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device 500 for a screen display method of a virtual scene according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a human-machine interaction engine installed in a screen display device of a virtual scene according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of a method for displaying a frame of a virtual scene according to an embodiment of the present application;
FIG. 5 is a representation diagram of a virtual object being controlled to enter the interior of a virtual item according to an embodiment of the present application;
FIG. 6 is a schematic view of a view field provided in an embodiment of the present application;
FIG. 7 is a schematic view of a view field provided in an embodiment of the present application;
FIG. 8 is a schematic view of a view field provided in an embodiment of the present application;
FIG. 9 is a presentation schematic diagram of the blurred display style provided by an embodiment of the application;
FIG. 10 is a schematic representation of a preset transparency display style provided in an embodiment of the present application;
FIG. 11 is a schematic representation of presentation of attribute indication information of property provided by an embodiment of the present application;
FIG. 12 is a representation of a prop display area provided in an embodiment of the present application;
FIG. 13 is a schematic view of a screen switch provided in an embodiment of the present application;
fig. 14 is a presentation schematic diagram of a display control of a prop display area provided in an embodiment of the present application;
fig. 15 is a schematic view illustrating a process of exiting a virtual item according to an embodiment of the present application;
fig. 16 is a screen display diagram of a virtual scene provided in the related art;
fig. 17 is a flowchart illustrating a method for displaying a frame of a virtual scene according to an embodiment of the present application;
fig. 18 is a schematic structural diagram of a screen display device 555 for a virtual scene according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first", "second", and "third" are used only to distinguish similar objects and do not denote a particular order; it is understood that, where permitted, the specific order or sequence may be interchanged, so that the embodiments of the application described herein can be practiced in an order other than that shown or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) Client: an application program running in the terminal for providing various services, such as an instant messaging client or a video playing client.
2) In response to: used to indicate the condition or state on which a performed operation depends; when the dependent condition or state is satisfied, the one or more operations performed may occur in real time or with a set delay. Unless otherwise specified, there is no restriction on the order in which the operations are performed.
3) The virtual scene is a virtual scene displayed (or provided) when an application program runs on the terminal. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present invention. For example, a virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as deserts, cities, etc., and a user may control a virtual object to move in the virtual scene.
4) Virtual objects, the appearance of various people and objects in the virtual scene that can interact, or movable objects in the virtual scene. The movable object can be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil drums, walls, stones, etc. displayed in the virtual scene. The virtual object may be an avatar in the virtual scene that is virtual to represent the user. The virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene and occupying a portion of the space in the virtual scene.
Alternatively, the virtual object may be a user character controlled through operations on the client, an Artificial Intelligence (AI) configured in the virtual-scene battle through training, or a Non-Player Character (NPC) configured in the virtual-scene interaction. Alternatively, the virtual object may be a virtual character engaged in adversarial interaction in the virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control the virtual object to fall freely, glide, or open a parachute to descend in the sky of the virtual scene, to run, jump, or crawl on land, or to swim, float, or dive in the ocean. The user may also control the virtual object to ride a vehicle-type virtual prop to move in the virtual scene; for example, the vehicle-type virtual prop may be a virtual car, a virtual aircraft, a virtual yacht, or the like. The user may further control the virtual object to engage in adversarial interaction with other virtual objects through attack-type virtual props; for example, such a virtual prop may be a virtual mecha, a virtual tank, a virtual fighter plane, or the like. The above scenarios are merely illustrative, and the embodiment of the present application is not specifically limited thereto.
5) Scene data, representing various features exhibited by objects in the virtual scene during interaction, may include, for example, the positions of the objects in the virtual scene. Of course, different types of features may be included depending on the type of virtual scene; for example, in a game virtual scene, the scene data may include the waiting time required for various functions provided in the virtual scene (depending on how many times the same function can be used within a certain time), and may also include attribute values representing various states of a game character, for example, a life value (also referred to as a red value), a magic value (also referred to as a blue value), a protection value, an electric quantity value, and the like.
Based on the above explanations of terms and terms involved in the embodiments of the present application, the screen display system of a virtual scene provided by the embodiments of the present application is explained below. Referring to fig. 1, fig. 1 is a schematic diagram of an architecture of a screen display system 100 of a virtual scene provided in an embodiment of the present application, in order to support an exemplary application, terminals (exemplary terminals 400-1 and 400-2 are shown) are connected to a server 200 through a network 300, where the network 300 may be a wide area network or a local area network, or a combination of the two, and data transmission is implemented using a wireless or wired link.
The terminal (such as the terminal 400-1 and the terminal 400-2) is used for sending an acquisition request of scene data of the virtual scene to the server 200 based on the view interface receiving the triggering operation of entering the virtual scene;
the server 200 is configured to receive an acquisition request of scene data, and return the scene data of a virtual scene to the terminal in response to the acquisition request;
terminals (such as the terminal 400-1 and the terminal 400-2) for receiving the scene data of the virtual scene, rendering a picture of the virtual scene based on the scene data, and presenting the picture of the virtual scene on a graphical interface (graphical interface 410-1 and graphical interface 410-2 are shown as examples); an object interaction environment, interaction objects, and the like may also be presented in the picture of the virtual scene, and the content presented in the picture of the virtual scene is rendered based on the returned scene data of the virtual scene.
In practical application, the server 200 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, a middleware service, a domain name service, a security service, a CDN, a big data and artificial intelligence platform, and the like. The terminals (e.g., terminal 400-1 and terminal 400-2) may be, but are not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart television, a smart watch, and the like. The terminals (e.g., terminal 400-1 and terminal 400-2) and the server 200 may be directly or indirectly connected through wired or wireless communication, and the application is not limited thereto.
In actual applications, the terminals (including the terminal 400-1 and the terminal 400-2) are installed with and run applications supporting virtual scenes. The application program may be any one of a First-Person Shooting game (FPS), a third-person shooting game, a Multiplayer Online Battle Arena (MOBA) game, a Two-dimensional (2D) game application, a Three-dimensional (3D) game application, a virtual reality application, a three-dimensional map program, a military simulation program, or a multiplayer gunfight survival game. The application may also be a stand-alone application, such as a stand-alone 3D game program.
The virtual scene involved in the embodiment of the invention can be used for simulating a two-dimensional virtual space or a three-dimensional virtual space and the like. Taking the example that the virtual scene simulates a three-dimensional virtual space, which may be an open space, the virtual scene may be used to simulate a real environment in reality, for example, the virtual scene may include sky, land, sea, and the like, and the land may include environmental elements such as a desert, a city, and the like. Of course, the virtual scene may also include virtual objects, such as buildings, vehicles, and props for arming themselves or weapons required for fighting with other virtual objects. The virtual scene can also be used for simulating real environments in different weathers, such as sunny days, rainy days, foggy days or nights. The virtual object may be an avatar in the virtual scene for representing the user, and the avatar may be in any form, such as a simulated character, a simulated animal, and the like, which is not limited by the invention. In actual implementation, a user may use a terminal (such as terminal 400-1) to control a virtual object to perform activities in the virtual scene, including but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing.
Taking an electronic game scene as an exemplary scene, a user may operate on a terminal in advance, and after detecting the user's operation, the terminal may download a game configuration file of the electronic game, where the game configuration file may include the application program, interface display data, virtual scene data, and the like of the electronic game, so that the user can call the game configuration file when logging into the electronic game on the terminal, to render and display an electronic game interface. The user may perform a touch operation on the terminal; after detecting the touch operation, the terminal may determine game data corresponding to the touch operation and render and display the game data, where the game data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
In practical application, a terminal (including the terminal 400-1 and the terminal 400-2) receives a trigger operation for entering a virtual scene based on a view interface, and sends an acquisition request of scene data of the virtual scene to the server 200; the server 200 receives the acquisition request of the scene data, responds to the acquisition request, and returns the scene data of the virtual scene to the terminal; the terminal receives scene data of the virtual scene, renders pictures of the virtual scene based on the scene data, and presents the pictures of the virtual scene;
further, the terminal presents a virtual prop (such as a virtual mecha used for fighting, or a virtual vehicle that assists the virtual object in moving, such as a car or an aircraft) in the picture of the virtual scene; in response to a trigger instruction for the virtual prop in the virtual scene, the terminal controls a virtual object (namely, the avatar corresponding to the user logged in to the electronic game) to enter the inside of the virtual prop; in response to an operation instruction of the virtual object for the virtual prop, the terminal controls the virtual prop to be in an operating state and presents the view picture when the virtual object is inside the virtual prop in the operating state; and in the view picture, the terminal presents a prop display area of the virtual prop and, in the prop display area, presents a picture of the virtual prop in the virtual scene.
Taking a military virtual simulation application as an exemplary scene, virtual scene technology enables trainees to experience the battlefield environment realistically, both visually and aurally, to become familiar with the environmental characteristics of the area to be fought in, and to interact with objects in the virtual environment through the necessary equipment. The implementation method of the virtual battlefield environment can, through background generation and image synthesis, create a perilous and nearly real three-dimensional battlefield environment by means of a corresponding three-dimensional battlefield environment graphic image library that includes a battle background, battlefield scenes, various weaponry, fighters, and the like.
In actual implementation, the terminal (including the terminal 400-1 and the terminal 400-2) sends an acquisition request of scene data of a virtual scene to the server 200 based on a trigger operation of entering the virtual scene received by the view interface; the server 200 receives the acquisition request of the scene data, responds to the acquisition request, and returns the scene data of the virtual scene to the terminal; the terminal receives scene data of the virtual scene, renders pictures of the virtual scene based on the scene data, and presents the pictures of the virtual scene;
further, the terminal presents a virtual prop (such as a tank used for fighting, or a virtual vehicle that assists the virtual object in moving, such as a car or an aircraft) in the picture of the virtual scene; in response to a trigger instruction for the virtual prop in the virtual scene, the terminal controls a virtual object (namely, a simulated fighter in the virtual military simulation scene) to enter the virtual prop; in response to an operation instruction of the virtual object for the virtual prop, the terminal controls the virtual prop to be in an operating state and presents the view picture when the virtual object is inside the virtual prop in the operating state; and in the view picture, the terminal presents a prop display area of the virtual prop and, in the prop display area, presents a picture of the virtual prop in the virtual scene.
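The terminal-server exchange described above can be reduced to a simple request/response flow. The following is a minimal sketch of that flow; all function and field names are illustrative assumptions, and the network call and rendering are stubbed out since the patent specifies no protocol or API.

```python
def handle_scene_request(scene_db, scene_id):
    """Server side: respond to an acquisition request for scene data."""
    return scene_db.get(scene_id)  # None when the scene is unknown

def enter_virtual_scene(scene_db, scene_id):
    """Terminal side: request scene data, then assemble the scene picture."""
    scene_data = handle_scene_request(scene_db, scene_id)  # a network call in reality
    if scene_data is None:
        return None
    # "Rendering" is reduced here to assembling a description of the picture.
    return {
        "terrain": scene_data["terrain"],
        "props": list(scene_data["props"]),
    }

# Hypothetical scene data held by the server.
scene_db = {"battlefield-1": {"terrain": "desert", "props": ["tank", "aircraft"]}}
picture = enter_virtual_scene(scene_db, "battlefield-1")
```

In a real implementation the request would travel over the network to server 200 and the returned data would feed a rendering pipeline rather than a dictionary.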
Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device 500 for a screen display method of a virtual scene according to an embodiment of the present application. In practical applications, the electronic device 500 may be a server or a terminal shown in fig. 1, and an electronic device that implements the screen display method of a virtual scene according to an embodiment of the present application is described by taking the electronic device 500 as the terminal shown in fig. 1 as an example, where the electronic device 500 includes: at least one processor 510, memory 550, at least one network interface 520, and a user interface 530. The various components in the electronic device 500 are coupled together by a bus system 540. It is understood that the bus system 540 is used to enable communications among the components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 540 in fig. 2.
The Processor 510 may be an integrated circuit chip having Signal processing capabilities, such as a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like, wherein the general purpose Processor may be a microprocessor or any conventional Processor, or the like.
The user interface 530 includes one or more output devices 531 enabling presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 530 also includes one or more input devices 532, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 550 optionally includes one or more storage devices physically located remote from processor 510.
The memory 550 may comprise volatile memory or nonvolatile memory, and may also comprise both volatile and nonvolatile memory. The nonvolatile memory may be a Read Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 can store data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a network communication module 552 for communicating to other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: bluetooth, wireless compatibility authentication (WiFi), and Universal Serial Bus (USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
an input processing module 554 to detect one or more user inputs or interactions from one of the one or more input devices 532 and to translate the detected inputs or interactions.
In some embodiments, the picture display device of the virtual scene provided in the embodiments of the present application may be implemented in software, and fig. 2 illustrates a picture display device 555 of the virtual scene stored in a memory 550, which may be software in the form of programs and plug-ins, and includes the following software modules: the control module 5551, the first rendering module 5552 and the second rendering module 5553 are logical and thus may be arbitrarily combined or further divided according to the implemented functions, which will be described below.
In other embodiments, the picture display Device of the virtual scene provided in this embodiment may be implemented by combining hardware and software, and as an example, the picture display Device of the virtual scene provided in this embodiment may be a processor in the form of a hardware decoding processor, which is programmed to execute the picture display method of the virtual scene provided in this embodiment, for example, the processor in the form of the hardware decoding processor may employ one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
In some embodiments, a human-computer interaction engine for implementing the image display method of the virtual scene is installed in the image display device 555 of the virtual scene, where the human-computer interaction engine includes a functional module, a component, or a plug-in for implementing the image display method of the virtual scene, fig. 3 is a schematic diagram of the human-computer interaction engine installed in the image display device of the virtual scene provided in the embodiments of the present application, and referring to fig. 3, taking the virtual scene as a game scene as an example, and accordingly, the human-computer interaction engine is a game engine.
The game engine is a set of code (instructions) designed for a machine that runs a certain kind of game and that the machine can recognize; like an engine, it controls the running of the game. A game program can be divided into two parts, the game engine and the game resources; the game resources include images, sounds, animations, and the like, so a game amounts to engine (program code) + resources (images, sounds, animations, and the like), and the game engine calls the resources in sequence according to the requirements of the game design.
The screen display method of the virtual scene provided by the embodiment of the present application may be implemented by each module in the screen display device of the virtual scene shown in fig. 2 by calling a relevant module, component or plug-in of the game engine shown in fig. 3, and the following describes an exemplary module, component or plug-in included in the game engine shown in fig. 3.
As shown in fig. 3, the engine includes: 1) The virtual camera, used to present the game scene picture. Each game scene corresponds to at least one virtual camera; two or more virtual cameras may be used as game rendering windows as actually needed, to capture and present the picture content of the game world to the player, and the perspective from which the player watches the game world, such as a first-person perspective or a third-person perspective, can be adjusted by setting the parameters of the virtual camera.
2) Scene organization, used for game scene management, such as collision detection and visibility culling. Collision detection may be implemented with colliders, which, according to actual requirements, may be implemented as an Axis-Aligned Bounding Box (AABB) or an Oriented Bounding Box (OBB). Visibility culling may be implemented based on a view frustum, a three-dimensional volume generated from the virtual camera and used to cull objects outside the camera's visual range: objects inside the frustum are projected onto the view plane, while objects outside the frustum are discarded and not processed.
3) The terrain management module, a component for managing terrain in the game scene, used to create and edit game terrain, such as the mountains, canyons, and caves of a game scene.
4) Editor, an aid in game design, comprising:
the scene editor is used for editing the game scene content, such as changing the terrain, customizing vegetation distribution, lighting layout and the like;
a model editor for making and editing models in a game (character models in a game scene);
the special effect editor is used for editing the special effect in the game picture;
and the action editor is used for defining and editing the action of the character in the game picture.
5) The special effects component, used to create and edit game special effects in the game picture; in practical applications it may be implemented with particle effects and texture UV animation. A particle effect combines countless individual particles into a fixed form, and the movement of the particles, as a whole or individually, is controlled by controllers and scripts, simulating realistic effects such as water, fire, fog, and gas; UV animation is texture animation implemented by dynamically modifying the UV coordinates of a map.
6) Skeletal animation, which uses built-in bones to drive an object's movement, and can be understood in terms of the following two concepts:
Bone: an abstract concept used to control the skin, just as the human skeleton controls the human skin;
Skin: the externally displayed elements controlled by the bones, such as the skin of the human body, which is affected by the bones.
7) Morph animation: i.e., morphing animation, animation achieved by adjusting the vertices of the base model.
8) UI controls, used to implement control of the game picture display.
9) Underlying algorithms: the algorithms that must be called to implement the functions of the game engine, such as the graphics algorithms required by scene organization and the matrix and vector transformations required by skeletal animation.
10) The rendering component, essential for displaying the game picture effect; through the rendering component, a scene described by three-dimensional vectors is converted into a scene described by two-dimensional pixels, including model rendering and scene rendering.
11) A* pathfinding, an algorithm for finding the shortest path, used for path planning, wayfinding, and graph traversal in game design.
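The collision-detection primitive named in the scene-organization component above, the AABB overlap test, is simple enough to sketch directly. The following is a minimal illustration, not the engine's actual implementation; boxes are represented as (min_corner, max_corner) pairs of 3-D coordinates.

```python
def aabb_overlap(box_a, box_b):
    """Two axis-aligned bounding boxes overlap iff their
    intervals overlap on every one of the three axes."""
    (min_a, max_a), (min_b, max_b) = box_a, box_b
    return all(min_a[i] <= max_b[i] and min_b[i] <= max_a[i] for i in range(3))

# Hypothetical colliders for props in a scene.
tank = ((0, 0, 0), (2, 1, 3))
wall = ((1.5, 0, 0), (2.5, 2, 1))       # abuts the tank's right side
far_rock = ((10, 0, 0), (11, 1, 1))     # well away from the tank
```

An OBB test is analogous but requires a separating-axis test over the boxes' own axes, which is why the AABB variant is the cheaper default.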
For example, interaction between the user and the game can be implemented by calling the UI controls in the game engine shown in fig. 3; a two-dimensional or three-dimensional model is created by calling the Morph animation part of the game engine; after the model is created, material maps are applied to the model, surface by surface, through the skeletal animation part, which is equivalent to covering the skeleton with skin; and finally all effects of the model, such as animation, light and shadow, and special effects, are computed in real time by the rendering component and displayed on the human-computer interaction interface. Specifically, the control module 5551 may receive a trigger instruction for the virtual item in the virtual scene by calling a UI control in the game engine shown in fig. 3, and in response to the trigger instruction, control the virtual object to enter the inside of the virtual item.
The first presentation module 5552 may receive an operation instruction of the virtual object for the virtual item by calling a UI control in the game engine shown in fig. 3, and in response to the operation instruction, control the virtual item to be in an operation state; then, after rendering the virtual scene data by calling the rendering component in the game engine shown in fig. 3, the view picture when the virtual object is inside the virtual item in the operation state is presented.
The second presenting module 5553 may present, after rendering the virtual scene data by calling a rendering component in the game engine shown in fig. 3, a prop display area of the virtual prop in the view screen, and present, in the prop display area, a screen of the virtual prop in the virtual scene.
Based on the above description of the screen display system and the electronic device for virtual scenes provided in the embodiments of the present application, the screen display method for virtual scenes provided in the embodiments of the present application is described below. In some embodiments, the screen display method of the virtual scene provided in the embodiments of the present application may be implemented by a server or a terminal alone, or implemented by a server and a terminal in a cooperation manner, and the screen display method of the virtual scene provided in the embodiments of the present application is described below with an embodiment of a terminal as an example.
Referring to fig. 4, fig. 4 is a schematic flowchart of a screen display method of a virtual scene provided in an embodiment of the present application, where the screen display method of the virtual scene provided in the embodiment of the present application includes:
step 101: and the terminal responds to a trigger instruction aiming at the virtual prop in the virtual scene and controls the virtual object to enter the interior of the virtual prop.
Here, the terminal is installed with an application client supporting a virtual scene, and when a user opens the application client on the terminal and the terminal runs the application client, the terminal presents a picture of the virtual scene (such as a shooting game scene), which may be a two-dimensional virtual scene or a three-dimensional virtual scene.
In practical applications, the terminal presents the virtual props in the virtual scene in the picture of the virtual scene; the virtual props may be vehicle-type virtual props such as virtual cars, virtual aircraft, and virtual yachts, or attack-type virtual props such as virtual mechas, virtual tanks, and virtual fighter jets. The terminal receives a trigger instruction for a virtual prop and controls a virtual object to enter the virtual prop, such as controlling the virtual object to enter a virtual mecha prop or a virtual car prop, where the virtual object is the avatar of the user of the login account corresponding to the virtual scene; for example, when the virtual scene is an electronic game scene, the virtual object is the avatar corresponding to the player who logs in to the electronic game client.
In some embodiments, the terminal may control the virtual object to enter the inside of the virtual prop by: presenting a usage control for accessing an interior of the virtual item; and controlling the virtual object to enter the inside of the virtual prop in response to a triggering instruction for the virtual prop in the virtual scene based on the triggering of the using control.
Here, the terminal may present a use control for entering the inside of the virtual item, so that the user can control the virtual object to enter the inside of the virtual item through the use control. Specifically, the use control may be presented in the picture of the virtual scene; its presentation may be triggered by clicking the virtual prop, or it may be presented automatically when the virtual object is detected within the induction range of the virtual prop, and the user may trigger a trigger instruction for the virtual prop in the virtual scene by clicking or long-pressing the use control. When the terminal receives a trigger instruction for the virtual prop in the virtual scene based on the triggering of the use control, the terminal responds to the trigger instruction and controls the virtual object to enter the inside of the virtual prop.
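The two presentation triggers just described, a click on the prop or the object entering its induction range, can be sketched as a small predicate. This is an illustrative sketch only; the function names and the 5.0-unit range are assumptions, not values from the patent.

```python
import math

def within_induction_range(object_pos, prop_pos, induction_range=5.0):
    """True when the virtual object is inside the prop's induction range."""
    return math.dist(object_pos, prop_pos) <= induction_range

def should_show_use_control(object_pos, prop_pos, prop_clicked):
    """The 'use' control is presented after a click on the prop,
    or automatically when the object is within the induction range."""
    return prop_clicked or within_induction_range(object_pos, prop_pos)
```

Once the control is shown, a click or long-press on it would raise the trigger instruction that moves the virtual object inside the prop.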
By way of example, referring to fig. 5, fig. 5 is a schematic representation of controlling a virtual object to enter the inside of a virtual prop according to an embodiment of the present application. Here, the terminal presents a use control "enter" for entering the inside of the virtual item "mecha" in the picture of the virtual scene, as shown in diagram A in fig. 5; in response to a click operation on the use control "enter", a trigger instruction for the virtual item "mecha" is received, and in response to the trigger instruction, the virtual object is controlled to enter the inside of the virtual item, as shown in diagram B in fig. 5, which presents the view picture corresponding to the virtual object when it enters the inside of the virtual item "mecha", that is, a picture obtained by observing the virtual scene from the first-person perspective of the virtual object.
Step 102: and responding to an operation instruction of the virtual object for the virtual prop, controlling the virtual prop to be in an operation state, and presenting a view picture when the virtual object is in the virtual prop in the operation state.
Here, in practical applications, after the terminal controls the virtual object to enter the inside of the virtual item, the terminal may present operation controls corresponding to the virtual item, such as a shooting control, a movement direction control, and a jumping control, through which the user can operate the virtual prop. When the terminal receives an operation instruction of the virtual object for the virtual prop, it controls the virtual prop to be in an operating state and presents the view picture when the virtual object is inside the virtual prop in the operating state. In practical applications, the view picture is observed using the first-person perspective of the virtual object as the observation perspective.
Step 103: and in the visual field picture, presenting a prop display area of the virtual prop, and in the prop display area, presenting a picture of the virtual prop in a virtual scene.
Here, after presenting the view picture when the virtual object is inside the virtual item, the terminal also presents the item display area of the virtual item in the view picture. The prop display area is used to display a picture of the virtual prop in the virtual scene, so that while viewing the view picture of the virtual object inside the virtual prop, the user can also view, in real time through the prop display area, the picture of the virtual prop in the virtual scene, that is, the scenery and picture around the virtual prop. Specifically, the prop display area may be independent of the view picture, that is, in a picture-in-picture mode, in which the prop display area can be moved by operations such as the user's dragging; or the item display area may be fixed in a certain position as a part of the virtual item, for example, presented as a screen display control of the virtual item.
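The two modes of the prop display area, a draggable picture-in-picture window versus a fixed on-prop screen control, can be captured in a small data model. This is a purely illustrative sketch; the class, mode names, and coordinates are assumptions.

```python
class PropDisplayArea:
    """The area in the view picture that shows the prop's own picture."""

    def __init__(self, mode, position):
        assert mode in ("picture_in_picture", "fixed")
        self.mode = mode
        self.position = position  # (x, y) in screen coordinates
        self.visible = True

    def drag_to(self, new_position):
        # Only the picture-in-picture variant moves in response to dragging;
        # a fixed area is part of the prop and ignores the operation.
        if self.mode == "picture_in_picture":
            self.position = new_position

pip = PropDisplayArea("picture_in_picture", (10, 10))
pip.drag_to((200, 120))                 # moves with the user's drag
fixed = PropDisplayArea("fixed", (0, 0))
fixed.drag_to((200, 120))               # ignored: fixed areas do not move
```

In an engine, the area's content would typically come from rendering a second camera into a texture that this widget displays.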
As an example, referring to fig. 6, fig. 6 is a schematic view of a view picture provided in an embodiment of the present application, where the view picture is the view picture when the virtual object is inside the virtual item "mecha"; a prop display area of the virtual prop, that is, the "mecha" screen display control, is also displayed in the view picture, and the picture of the virtual prop "mecha" in the virtual scene is displayed through the prop display area.
In some embodiments, the view frame includes: a first sub-picture corresponding to the inside of the virtual prop and a second sub-picture corresponding to the outside of the virtual prop; correspondingly, the terminal can present the item display area of the virtual item in the following way: and displaying a prop display area of the virtual prop in a first sub-picture corresponding to the interior of the virtual prop.
In practical applications, after the terminal controls the virtual object to enter the interior of the virtual prop, the displayed view picture of the virtual object inside the virtual prop includes a first sub-picture corresponding to the interior of the virtual prop and a second sub-picture corresponding to the exterior of the virtual prop. On this basis, the terminal can present the item display area of the virtual item in the first sub-picture corresponding to the interior of the virtual prop. By way of example, referring to fig. 7, fig. 7 is a presentation schematic diagram of a view picture provided by an embodiment of the present application. Here, the view picture is the view picture when the virtual object is inside the virtual item "mecha", and includes a first sub-picture A1 corresponding to the interior of the virtual item "mecha" and a second sub-picture A2 corresponding to the exterior of the virtual item "mecha"; at the same time, in the first sub-picture A1, a prop display area A3 of the virtual prop, that is, the prop screen display control of the "mecha", is presented.
In some embodiments, the terminal may present the view picture when the virtual object is inside the virtual prop in the following way: when the virtual object is inside the virtual prop in the operating state, presenting the view picture observed using the first-person perspective as the observation perspective; correspondingly, the terminal may present the picture of the virtual item in the virtual scene in the following way: in the prop display area, displaying the picture of the virtual prop in the virtual scene from a third-person perspective.
Here, in practical applications, the view picture presented by the terminal when the virtual object is inside the virtual item is the view picture observed, while the virtual object is inside the virtual item in the operating state, using the first-person perspective of the virtual object as the observation perspective. In this application, through the prop display area in that view picture, the terminal presents the picture of the virtual prop in the virtual scene from a third-person perspective, so that while controlling the virtual prop to interact, the user can view the full view of the virtual prop and know at any time the environment the virtual prop is currently in and its surrounding conditions. As an example, referring to fig. 8, fig. 8 is a presentation schematic diagram of a view picture provided in an embodiment of the present application. Here, the view picture is the view picture when the virtual object is inside the virtual item "mecha", observed using the first-person perspective of the virtual object as the observation perspective; and in the prop display area of the virtual prop displayed in the view picture, that is, the "mecha" screen display control, the picture of the virtual prop in the virtual scene is displayed from a third-person perspective.
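The arrangement described above implies two simultaneous cameras: a first-person camera at the virtual object's position for the main view, and a third-person camera trailing the prop for the prop display area. The sketch below illustrates that pairing; the camera representation and the trailing offset are assumptions for illustration.

```python
def first_person_camera(object_pos):
    """Main view: the camera sits at the virtual object's eye position."""
    return {"pos": object_pos, "perspective": "first_person"}

def third_person_camera(prop_pos, offset=(0.0, 8.0, -12.0)):
    """Prop display area: the camera trails the prop so its full
    outline and surroundings stay visible."""
    pos = tuple(p + o for p, o in zip(prop_pos, offset))
    return {"pos": pos, "perspective": "third_person"}

main_view = first_person_camera((4.0, 1.7, 9.0))
prop_view = third_person_camera((4.0, 0.0, 9.0))
```

Each frame, the engine would render the third-person camera into the prop display area while the first-person camera fills the rest of the view picture.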
In some embodiments, the terminal may present the picture that the virtual item is in the virtual scene by: in the item display area, adopting a display style corresponding to the item attribute of the virtual item to present a picture of the virtual item in a virtual scene; wherein the property attribute comprises at least one of: and the electric quantity attribute and the protection attribute of the virtual prop.
In practical applications, the virtual item may have various item attributes, such as the electric quantity attribute, the protection attribute, and the attack attribute of the virtual item. Corresponding display styles can therefore be set for different item attributes, so that when the picture of the virtual item in the virtual scene is presented through the item display area, it is presented in the display style corresponding to the item attribute. Specifically, different item attributes have corresponding attribute values, and presentation rules for display styles may be set; for example, when the attribute value of an item attribute is in a preset attribute value interval, the display style corresponding to that item attribute, such as a fuzzy display style or a preset-transparency display style, may be used.
In some embodiments, the terminal may present a picture of the virtual item in the virtual scene by adopting a display style corresponding to the item attribute of the virtual item as follows: when the property is the protection property, obtaining the protection property value of the virtual property; and when the protection attribute value belongs to the target protection attribute value interval, adopting a fuzzy display style to present a picture of the virtual prop in the virtual scene.
In practical applications, when the prop attribute is the protection attribute, a target protection attribute value interval may be preset, such as 0-25% (i.e., the current protection attribute value is less than 25% and greater than 0% of the total protection attribute value). When the obtained protection attribute value of the virtual item is in the target protection attribute value interval, for example between 0% and 25%, a fuzzy display style may be adopted to present the picture of the virtual item in the virtual scene. Specifically, the fuzzy display style performs blur processing, such as Gaussian blur, on the picture of the virtual item in the virtual scene, and then displays the blurred picture. For example, referring to fig. 9, fig. 9 is a schematic presentation diagram of a fuzzy display style provided in an embodiment of the present application, where the picture of the virtual item in the virtual scene is subjected to Gaussian blur processing and then presented, that is, the presented picture of the virtual item in the virtual scene has a blurred display effect.
In some embodiments, the terminal may present a picture of the virtual item in the virtual scene by adopting a display style corresponding to the item attribute of the virtual item as follows: when the property attribute is an electric quantity attribute, acquiring an electric quantity attribute value of the virtual property; and when the electric quantity attribute value belongs to the target electric quantity attribute value interval, displaying a style by adopting a preset transparency, and presenting a picture of the virtual prop in the virtual scene.
In practical applications, when the prop attribute is the electric quantity attribute, a target electric quantity attribute value interval may be preset, such as 0-25% (that is, the current electric quantity attribute value is less than 25% and greater than 0% of the total electric quantity attribute value). When the obtained electric quantity attribute value of the virtual item is within the target electric quantity attribute value interval, for example between 0% and 25%, a preset-transparency display style may be adopted to present the picture of the virtual item in the virtual scene. Specifically, the preset-transparency display style presents the picture of the virtual item in the virtual scene at a preset transparency, for example 60% transparency, that is, the presented picture of the virtual item in the virtual scene has a display effect of the preset transparency. As an example, referring to fig. 10, fig. 10 is a schematic presentation diagram of a preset-transparency display style provided in an embodiment of the present application, where the picture of the virtual item in the virtual scene presented in the item display area has a display effect of 40% transparency.
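The attribute-to-style rules described in the preceding paragraphs can be sketched as a single lookup: blur when the protection attribute falls in its target interval, preset transparency when the electric quantity attribute falls in its target interval. The 0-25% intervals follow the text; the function name, the 60% transparency default, and the returned dictionary shape are assumptions.

```python
def display_styles(protection_pct, power_pct,
                   target_interval=(0.0, 25.0), preset_transparency=0.6):
    """Map current attribute values (percent of the total) to the
    display styles applied to the prop display area's picture."""
    styles = {}
    low, high = target_interval
    if low < protection_pct < high:
        styles["blur"] = "gaussian"            # fuzzy display style
    if low < power_pct < high:
        styles["transparency"] = preset_transparency
    return styles
```

Both styles can apply at once when both attributes are low, since each rule is evaluated independently.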
In some embodiments, the terminal may present, in the item display area, attribute indication information corresponding to an item attribute of the virtual item; wherein the property attribute comprises at least one of: the electric quantity attribute and the protection attribute of the virtual prop; the attribute indication information is used for indicating the available state corresponding to the property of the prop.
In practical application, the terminal may further present, in the item display area, attribute indication information corresponding to an item attribute of the virtual item, where the attribute indication information is used to indicate the available state corresponding to that item attribute. The attribute indication information may specifically include: attribute indication information indicating the available state corresponding to the protection attribute, and attribute indication information indicating the available state corresponding to the electric quantity attribute. By way of example, referring to fig. 11, fig. 11 is a presentation schematic diagram of attribute indication information of an item attribute provided in an embodiment of the present application. Here, attribute indication information B1 indicates the available state corresponding to the protection attribute, and attribute indication information B2 indicates the available state corresponding to the electric quantity attribute; the attribute indication information also indicates the current value of each attribute. The protection attribute value corresponding to the protection attribute is 100% and its state is available, and the electric quantity attribute value corresponding to the electric quantity attribute is 15% and its state is also available.
In some embodiments, the terminal may present a hidden control for the prop display area in the view screen; correspondingly, after the prop display area of the virtual prop is presented, the prop display area can be hidden in the following modes: receiving a region hiding instruction aiming at a prop display region triggered based on a hidden control; and hiding the prop display area of the presented virtual prop in response to the area hiding instruction.
Here, after the terminal presents the prop display area in the view screen, a hidden control corresponding to the prop display area may also be presented in the view screen, so that the user hides the presented prop display area based on the hidden control. When the terminal receives a region hiding instruction aiming at the prop display region triggered based on the hiding control, the terminal hides the prop display region of the presented virtual prop in response to the region hiding instruction.
In some embodiments, the terminal may present, in the view screen, an area expansion control corresponding to the item display area; correspondingly, after hiding the item display area of the presented virtual item, the terminal can display the hidden item display area in the following way: receiving a region expansion instruction aiming at a prop display region, which is triggered based on a region expansion control; and responding to the area expansion instruction, and presenting a prop display area of the hidden virtual prop.
Here, after the terminal hides the item display area of the presented virtual item, an area expansion control corresponding to the item display area may be presented in the view screen, so that the user expands the item display area in the hidden state based on the area expansion control, and the item display area is presented again. When the terminal receives a region expansion instruction aiming at the prop display region triggered based on the region expansion control, the terminal responds to the region expansion instruction and presents the hidden prop display region of the virtual prop.
By way of example, referring to fig. 12, fig. 12 is a presentation schematic diagram of a prop display area provided in an embodiment of the present application. Here, the terminal presents the prop display area of the virtual prop in the view picture, together with the hidden control "retract" corresponding to the prop display area, as shown in diagram A in fig. 12; in response to an area hiding instruction triggered based on the hidden control "retract", the terminal hides the presented prop display area of the virtual prop and presents the area expansion control "expand" corresponding to the prop display area, as shown in diagram B in fig. 12; and in response to an area expansion instruction for the prop display area triggered based on the area expansion control "expand", the terminal presents the hidden prop display area of the virtual prop, as shown in diagram C in fig. 12.
In some embodiments, the terminal may switch the content presented by the main picture to the content presented by the item display area as follows: in response to a trigger operation for the item display area, the presented view picture of the virtual object inside the virtual item is switched to the picture of the virtual item in the virtual scene.
In the above embodiments, the terminal presents, through the main picture, the view picture when the virtual object is inside the virtual item, and presents, through the item display area of that view picture, the picture of the virtual item in the virtual scene. In practical application, the terminal can also switch the content presented by the main picture to the content presented by the item display area. Specifically, in response to a trigger operation for the item display area (such as a long-press operation or a click operation), the view picture of the virtual object inside the virtual item, originally presented in the main picture, is switched to the picture of the virtual item in the virtual scene presented in the item display area, so that the picture of the virtual item in the virtual scene is presented through the main picture.
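The switch described above amounts to exchanging the two rendered feeds between the main picture and the item display area. A minimal sketch, with hypothetical class and state names standing in for the two camera feeds:

```python
class PictureManager:
    """Minimal sketch of the main-picture / prop-display-area swap;
    the string state values are placeholders for two rendered feeds."""

    def __init__(self):
        # Main picture: first-person view of the virtual object inside the prop.
        self.main_picture = "first_person_inside_prop"
        # Prop display area: third-person picture of the prop in the scene.
        self.prop_area = "third_person_prop_in_scene"

    def on_prop_area_triggered(self):
        # A long press or click on the prop display area swaps the two feeds,
        # so the prop's picture is then presented through the main picture.
        self.main_picture, self.prop_area = self.prop_area, self.main_picture
```

After the swap, the sub-picture carrying the first-person feed corresponds to the independent sub-picture described in the following paragraphs.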
In some embodiments, after the terminal switches the presented main picture to a picture in which the virtual item is in the virtual scene, a sub-picture independent of the picture may also be presented, and in the sub-picture, a view picture corresponding to the outside of the virtual item when the virtual object is inside the virtual item is presented.
Here, after the terminal switches the presented main picture to a picture in which the virtual item is in the virtual scene, another sub-picture independent of the picture may be presented, and in the sub-picture, a view picture corresponding to the outside of the virtual item when the virtual object is inside the virtual item is presented, so as to improve the experience of the user.
As an example, referring to fig. 13, fig. 13 is a presentation schematic diagram of picture switching provided in an embodiment of the present application. Here, the terminal presents, through the main picture C1, the view picture when the virtual object is inside the virtual item, and presents, through the item display area of that view picture, the picture of the virtual item in the virtual scene, as shown in diagram A in fig. 13; in response to a trigger operation for the item display area (such as a long-press operation or a click operation), the view picture of the virtual object inside the virtual item presented in the main picture C1 is switched to the picture of the virtual item in the virtual scene presented in the item display area, so that the picture of the virtual item in the virtual scene is presented through the main picture C1;
meanwhile, the terminal also presents a sub-screen C2 independent of the main screen, and in the sub-screen C2, presents a view screen corresponding to the outside of the virtual item when the virtual object is inside the virtual item, as shown in B in fig. 13.
In some embodiments, the terminal may present a display control corresponding to the prop display area; responding to the trigger operation aiming at the display control, and controlling the state of the display control to be an opening state; correspondingly, when the display control is in the opening state, a prop display area of the virtual prop is presented in the visual field picture.
Here, the terminal may further present a display control corresponding to the prop display area, so as to control the opening and closing of the prop display area based on the display control. In response to a trigger operation for the display control, the state of the display control is set to the open state, and at this time the prop display area of the virtual prop is presented in the view picture. By way of example, referring to fig. 14, fig. 14 is a presentation schematic diagram of a display control of a prop display area provided in an embodiment of the present application. Here, the terminal presents the display control corresponding to the prop display area in the off state, and the prop display area is not presented in the view picture, as shown in diagram A in fig. 14; in response to a trigger operation for the display control, the state of the display control is set to the open state, and the prop display area of the virtual prop is presented in the view picture, as shown in diagram B in fig. 14.
In some embodiments, the terminal may control the virtual object to exit from the inside of the virtual prop as follows: acquire the prop attribute value corresponding to a prop attribute of the virtual prop; and when the prop attribute value is lower than the prop attribute value threshold, control the virtual object to exit from the interior of the virtual prop and cancel the presented prop display area of the virtual prop. Here, the prop attribute includes at least one of: the electric quantity attribute and the protection attribute of the virtual prop.
In practical application, the virtual prop includes a plurality of prop attributes, such as the protection attribute and the electric quantity attribute, and each prop attribute has a corresponding prop attribute value. When a prop attribute value of the virtual prop is lower than the prop attribute value threshold or reaches zero, the virtual object is controlled to automatically exit from the interior of the virtual prop, control of the virtual prop ends, and the presented prop display area of the virtual prop is cancelled.
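The automatic-exit condition above can be sketched as a simple check over the prop's attribute values; the function name and the zero threshold are illustrative assumptions, not specified by the application:

```python
def should_auto_exit(prop_attrs, threshold=0.0):
    """Return True when any prop attribute value (e.g. protection or
    electric quantity) has dropped to or below the threshold, in which
    case the virtual object is forced out of the prop and the prop
    display area is cancelled. `threshold` is an illustrative default."""
    return any(value <= threshold for value in prop_attrs.values())
```

In a game loop, this check would run each frame (or on each attribute change), triggering the exit and the cancellation of the prop display area when it returns True.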
In some embodiments, the terminal may control the virtual object to exit from inside the virtual item by: presenting an exit control corresponding to the virtual prop in a visual field picture; correspondingly, in response to the triggering operation aiming at the quitting control, the virtual object is controlled to quit from the interior of the virtual prop, and the prop display area of the presented virtual prop is cancelled.
In practical application, the terminal can also present, in the view picture, an exit control corresponding to the virtual prop, so that when the user wants to leave the interior of the virtual prop, the virtual object can be controlled to exit based on the exit control. In response to a trigger operation for the exit control, the terminal controls the virtual object to exit from the interior of the virtual prop and cancels the presented prop display area of the virtual prop. Referring to fig. 15, fig. 15 is a presentation schematic diagram of exiting a virtual prop provided in an embodiment of the present application. Here, the terminal presents, in the view picture, the exit control "exit" corresponding to the virtual prop together with the corresponding prop display area, as shown in diagram A in fig. 15; and in response to a trigger operation for the exit control "exit", the terminal controls the virtual object to exit from the interior of the virtual prop and cancels the presented prop display area, as shown in diagram B in fig. 15.
By applying the embodiments of the present application, when the virtual object is controlled to enter the inside of the virtual prop and the virtual prop is controlled to be in the operating state, the view picture of the virtual object inside the operating virtual prop is presented, and the prop display area of the virtual prop is presented in that view picture, so that the picture of the virtual prop in the virtual scene is presented through the prop display area. In this way, the view picture of the virtual object inside the operating virtual prop and, within it, the picture of the virtual prop in the virtual scene are presented together, which brings better immersion and experience to the user, makes it convenient for the user to know the operating state of the virtual prop and the surrounding virtual scene at any time without multiple interactive operations, improves human-computer interaction efficiency, and reduces the occupation of hardware processing resources.
An exemplary application of the embodiments of the present application in a practical application scenario will be described below.
In a virtual scene (such as an electronic game scene), after a player controls a virtual object to enter the inside of a virtual prop (such as a mecha prop), the terminal screen displays the view picture, from the first-person perspective, of the virtual object inside the virtual prop (the mecha prop). The player can control related operations of the mecha prop such as movement and attack, and the mecha prop model performs mecha actions in synchronization with the player's operations. In practical applications, the mecha actions include: forward, backward, left and right walking actions, steering, jumping, weapon switching, weapon firing and weapon reloading actions, impact actions, and the like.
Referring to fig. 16, fig. 16 is a schematic diagram of a picture display of a virtual scene provided in the related art. When a player operates the virtual prop, the terminal displays the view picture of the virtual scene from the first-person perspective in full screen, and the player can only see a local part of the mecha model through the limited field of view. The visual characteristics of the first-person perspective bring better immersion, but the limited field of view prevents the player from seeing the full view of the mecha model and its complete actions when the mecha moves and fights, so that the player's perception of the mecha is weak and the perception experience is monotonous.
Based on this, embodiments of the present application provide a picture display method for a virtual scene, so as to at least solve the above problems. In this application, the bottom of the interface of the virtual prop (e.g., the mecha prop) has a specially designed screen display area (that is, the prop display area described in the above embodiments). The screen display area presents to the player, in the form of a screen display, the actions of the prop model captured in real time by a camera in the virtual scene (a camera attached at a dedicated mount point that follows the player's virtual prop); that is, in the screen display area the player can fully see the real-time actions of the mecha prop from a third-person perspective, completely synchronized with the player's operation of the mecha prop.
Taking the virtual prop as a mecha prop as an example, the picture display method for a virtual scene provided in the embodiments of the present application is first described from the product side. The mecha prop is a special humanoid advanced weapon device in the game virtual scene; it has a protection attribute and an electric quantity attribute and can be actively and interactively used in battle. Specifically:
firstly, after a player controls a virtual object to enter an airplane A prop, the player starts to operate the airplane A prop to enter an airplane A operation state. At this time, the game visual angle is changed from a third person visual angle of the virtual object to a first person visual angle of the property A, the terminal adopts the first person visual angle to present a view picture of the virtual object in the property A in a view interface, and independently presents a property A screen display of the third person visual angle in a HUD bottom area of the property A, a screen display special camera starts to shoot with the random property A and displays the random property A in the screen display, and a player can check a full view of the property A model and a synchronous real-time image of a surrounding environment scene in the bottom UI screen display, as shown in FIG. 6;
secondly, when the protection attribute value of the airplane armor prop is less than 25% of the total amount and is not 0, changing the content displayed in the screen display area of the airplane armor prop into a Gaussian fuzzy state, as shown in fig. 9;
thirdly, when the electric quantity attribute value of the property of the airplane armor is less than 25% of the total amount and is not 0, changing the content displayed in the screen display area of the property of the airplane armor into a low transparency state, as shown in fig. 10;
fourthly, when the protection attribute value/electric quantity attribute value of the property of the machine A is equal to 0, the player automatically quits the property of the machine A, and the display of the screen display area of the property.
Next, the picture display method for a virtual scene provided in the embodiments of the present application is described from the technical side. Referring to fig. 17, fig. 17 is a schematic flowchart of the picture display method for a virtual scene provided in an embodiment of the present application, which includes:
step 201: and controlling the virtual object to enter the interior of the airplane nail prop in response to the clicking operation of the 'enter' button aiming at the airplane nail prop.
Step 202: and a special camera for activating the prop screen display area of the airplane prop.
Here, after it is detected that the player clicks an enter button of the property a, the camera dedicated to the screen display of the property a is activated, a real-time picture of a third person photographed by the camera is displayed in a property screen display area at the bottom of the UI, and a real-time picture of a first person viewing angle photographed by the camera is displayed outside the property screen display area at the bottom of the UI as usual.
Step 203: and (3) judging whether the protection attribute value of the airplane A prop is less than 25% of the total value of the protection attribute, if not, executing step 207, and if so, executing step 204.
Step 204: and (3) judging whether the protection attribute value of the airplane first prop is 0, if not, executing step 209, and if so, executing step 212.
Step 205: and judging whether the electric quantity attribute value of the airplane A prop is less than 25% of the total electric quantity attribute value, if not, executing a step 208, and if so, executing a step 206.
Step 206: and judging whether the electric quantity attribute value of the airplane first prop is 0, if not, executing a step 210, and if so, executing a step 212.
Step 207: display, through the prop screen display area (that is, the prop display area in the above embodiments), the real-time picture shot by the camera dedicated to the mecha screen display.
Here, the real-time picture is the picture of the mecha prop in the virtual scene observed from the third-person perspective; that is, the picture of the mecha prop in the virtual scene is presented through the prop screen display area from the third-person perspective.
Step 208: display, through the prop screen display area of the mecha prop, the real-time picture shot by the camera dedicated to the mecha screen display.
Step 209: display, through the prop screen display area of the mecha prop, the real-time picture shot by the dedicated camera after Gaussian blur processing.
Here, in practical applications, the corresponding picture is displayed in the prop screen display area of the mecha prop with a Gaussian-blur screen art effect. Specifically, a screen art effect showing Gaussian blur (the Gaussian blur radius may take a value of 8px) may be invoked on the mecha screen display.
Step 210: display, through the prop screen display area of the mecha prop, the real-time picture shot by the dedicated camera with a preset transparency.
Here, in practical applications, the corresponding picture is displayed in the prop screen display area of the mecha prop with a low-transparency screen art effect. Specifically, a screen art effect showing low transparency (the opacity value may be 55%) may be invoked on the mecha screen display.
Step 211: receive a click operation on the "exit" button of the mecha prop.
Step 212: cancel the displayed prop screen display area.
When it is detected that the protection attribute value or the electric quantity attribute value of the mecha prop is 0, the presentation in the prop screen display area at the bottom of the UI is cancelled; likewise, when a click operation in which the player clicks the exit button of the mecha prop is detected, the prop screen display area at the bottom of the UI cancels its presentation.
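The decision flow of steps 203-212 can be condensed into one small sketch. This is a hypothetical rendering of the flowchart, not the application's implementation; attribute values are expressed as fractions of the total, and the effect parameters (8px blur radius, 55% opacity) are the illustrative values given in the text:

```python
def prop_screen_content(protection, electric_quantity):
    """Decide what the prop's screen display area shows, following the
    branches of steps 203-212. Returns None when the display area is
    cancelled, otherwise a dict describing the screen art effect."""
    if protection == 0 or electric_quantity == 0:
        return None  # step 212: cancel the prop screen display area
    if protection < 0.25:
        # step 209: real-time picture after Gaussian blur processing
        return {"effect": "gaussian_blur", "radius_px": 8}
    if electric_quantity < 0.25:
        # step 210: real-time picture with low transparency
        return {"effect": "low_transparency", "opacity": 0.55}
    # steps 207/208: plain real-time third-person picture
    return {"effect": "none"}
```

Note that the protection branch is checked first here, mirroring the order of steps 203 and 205 in the flowchart.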
By applying the embodiments of the present application, a visible and dynamic third-person screen display (picture-in-picture) is added to the view picture of the original first-person perspective, so that the full view of the virtual prop model and the actions of the related model can be displayed to the player. On the basis of the immersion brought by the first-person perspective, the viewing and interaction advantages of the third-person perspective are integrated, which can greatly improve the sense of human-computer interaction and of technology, and makes the viewing experience more vivid and rich.
Continuing with the description of the picture display device 555 for a virtual scene provided in the embodiments of the present application, in some embodiments, the picture display device for a virtual scene may be implemented by software modules. Referring to fig. 18, fig. 18 is a schematic structural diagram of the picture display device 555 for a virtual scene provided in an embodiment of the present application; the device 555 includes:
the control module 5551 is configured to, in response to a trigger instruction for a virtual item in a virtual scene, control a virtual object to enter the inside of the virtual item;
a first presentation module 5552, configured to control, in response to an operation instruction of the virtual object for the virtual item, that the virtual item is in an operation state, and present, in the operation state, a view screen when the virtual object is inside the virtual item;
a second presenting module 5553, configured to present, in the view screen, a prop display area of the virtual prop, and present, in the prop display area, a screen of the virtual prop in the virtual scene.
In some embodiments, the first presenting module 5552 is further configured to present, in the operating state, the view picture observed from the first-person perspective when the virtual object is inside the virtual item;
correspondingly, the second presenting module 5553 is further configured to present, in the item display area, a picture of the virtual item in the virtual scene from a third-person perspective.
In some embodiments, the view frame comprises: a first sub-picture corresponding to the inside of the virtual prop and a second sub-picture corresponding to the outside of the virtual prop; the second presenting module 5553 is further configured to present, in a first sub-screen corresponding to the inside of the virtual item, an item display area of the virtual item.
In some embodiments, the second presenting module 5553 is further configured to present, in the item display area, a picture of the virtual item in the virtual scene in a display style corresponding to an item attribute of the virtual item;
wherein the prop attributes include at least one of: and the electric quantity attribute and the protection attribute of the virtual prop.
In some embodiments, the second presenting module 5553 is further configured to, when the property is a protection property, obtain a protection property value of the virtual property;
and when the protection attribute value belongs to a target protection attribute value interval, presenting a picture of the virtual prop in the virtual scene by adopting a fuzzy display style.
In some embodiments, the second presenting module 5553 is further configured to, when the property is an electric quantity property, obtain an electric quantity property value of the virtual property;
and when the electric quantity attribute value belongs to the target electric quantity attribute value interval, adopting a preset transparency display style to present a picture of the virtual prop in the virtual scene.
In some embodiments, the second presenting module 5553 is further configured to present a hidden control for the prop display area in the view screen;
correspondingly, the second presenting module 5553 is further configured to receive a region hiding instruction for the prop display region, where the region hiding instruction is triggered based on the hiding control;
and hide the presented prop display area of the virtual prop in response to the area hiding instruction.
In some embodiments, the second presenting module 5553 is further configured to present, in the view screen, an area expansion control corresponding to the prop display area;
correspondingly, the second presenting module 5553 is further configured to receive a region expansion instruction for the prop display region, where the region expansion instruction is triggered based on the region expansion control;
and responding to the area expansion instruction, and presenting a hidden prop display area of the virtual prop.
In some embodiments, the apparatus further comprises:
and the switching module is used for responding to the triggering operation aiming at the prop display area, and switching the displayed visual field picture when the virtual object is positioned in the virtual prop into the picture when the virtual prop is positioned in the virtual scene.
In some embodiments, the switching module is further configured to present a sub-picture independent of the picture, and
and presenting a visual field picture corresponding to the outside of the virtual prop when the virtual object is positioned in the virtual prop in the sub-picture.
In some embodiments, the apparatus further comprises:
the third presentation module is used for presenting a display control corresponding to the prop display area;
responding to the trigger operation aiming at the display control, and controlling the state of the display control to be an opening state;
correspondingly, the second presentation module is further configured to present, in the view screen, a prop display area of the virtual prop when the state of the display control is the open state.
In some embodiments, the control module 5551 is further configured to obtain a property attribute value corresponding to a property attribute of the virtual property; wherein the prop attributes include at least one of: the electric quantity attribute and the protection attribute of the virtual prop;
and when the property value of the prop is lower than the property value threshold value, controlling the virtual object to exit from the interior of the virtual prop and canceling the prop display area of the virtual prop.
In some embodiments, the second presenting module 5553 is further configured to present, in the view screen, an exit control corresponding to the virtual item;
correspondingly, the control module 5551 is further configured to, in response to a triggering operation for the exit control, control the virtual object to exit from the inside of the virtual item, and cancel the item display area of the virtual item that is presented.
In some embodiments, the control module 5551 is further configured to present a use control for accessing the interior of the virtual item;
and responding to a triggering instruction aiming at the virtual prop in the virtual scene based on the triggering of the use control, and controlling the virtual object to enter the inside of the virtual prop.
In some embodiments, the second presenting module 5553 is further configured to present, in the item display area, attribute indication information corresponding to an item attribute of the virtual item;
wherein the prop attributes include at least one of: the electric quantity attribute and the protection attribute of the virtual prop; and the attribute indication information is used for indicating the available state corresponding to the property of the prop.
By applying the embodiments of the present application, when the virtual object is controlled to enter the inside of the virtual prop and the virtual prop is controlled to be in the operating state, the view picture of the virtual object inside the operating virtual prop is presented, and the prop display area of the virtual prop is presented in that view picture, so that the picture of the virtual prop in the virtual scene is presented through the prop display area. In this way, the view picture of the virtual object inside the operating virtual prop and, within it, the picture of the virtual prop in the virtual scene are presented together, which brings better immersion and experience to the user, makes it convenient for the user to know the operating state of the virtual prop and the surrounding virtual scene at any time without multiple interactive operations, improves human-computer interaction efficiency, and reduces the occupation of hardware processing resources.
An embodiment of the present application further provides an electronic device, where the electronic device includes:
a memory for storing executable instructions;
and the processor is used for realizing the picture display method of the virtual scene provided by the embodiment of the application when the executable instructions stored in the memory are executed.
Embodiments of the present application also provide a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instruction from the computer-readable storage medium, and executes the computer instruction, so that the computer device executes the screen display method of the virtual scene provided by the embodiment of the application.
The embodiment of the present application further provides a computer-readable storage medium, which stores executable instructions, and when the executable instructions are executed by a processor, the method for displaying the virtual scene image provided by the embodiment of the present application is implemented.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may, but need not, correspond to files in a file system, and may be stored in a portion of a file that holds other programs or data, for example in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (18)

1. A picture display method for a virtual scene, the method comprising:
in response to a trigger instruction for a virtual prop in a virtual scene, controlling a virtual object to enter the interior of the virtual prop;
in response to an operation instruction of the virtual object for the virtual prop, controlling the virtual prop to be in an operating state, and presenting a view picture when the virtual object is inside the virtual prop in the operating state;
presenting, in the view picture, a prop display area of the virtual prop; and
presenting, in the prop display area, a picture of the virtual prop in the virtual scene.
2. The method of claim 1, wherein the presenting a view picture when the virtual object is inside the virtual prop in the operating state comprises:
when the virtual object is inside the virtual prop in the operating state, presenting a view picture observed from a first-person perspective;
correspondingly, the presenting, in the prop display area, a picture of the virtual prop in the virtual scene comprises:
in the prop display area, presenting a picture of the virtual prop in the virtual scene from a third-person perspective.
3. The method of claim 1, wherein the view picture comprises: a first sub-picture corresponding to the interior of the virtual prop and a second sub-picture corresponding to the exterior of the virtual prop;
the presenting, in the view picture, a prop display area of the virtual prop comprises:
presenting a prop display area of the virtual prop in the first sub-picture corresponding to the interior of the virtual prop.
4. The method of claim 1, wherein the presenting, in the prop display area, a picture of the virtual prop in the virtual scene comprises:
in the prop display area, presenting a picture of the virtual prop in the virtual scene in a display style corresponding to a prop attribute of the virtual prop;
wherein the prop attribute comprises at least one of: a power attribute and a protection attribute of the virtual prop.
5. The method of claim 4, wherein the presenting a picture of the virtual prop in the virtual scene in a display style corresponding to the prop attribute of the virtual prop comprises:
when the prop attribute is a protection attribute, obtaining a protection attribute value of the virtual prop;
and when the protection attribute value falls within a target protection attribute value interval, presenting a picture of the virtual prop in the virtual scene in a blurred display style.
6. The method of claim 4, wherein the presenting a picture of the virtual prop in the virtual scene in a display style corresponding to the prop attribute of the virtual prop comprises:
when the prop attribute is a power attribute, obtaining a power attribute value of the virtual prop;
and when the power attribute value falls within a target power attribute value interval, presenting a picture of the virtual prop in the virtual scene in a preset transparency display style.
7. The method of claim 1, wherein the method further comprises:
presenting, in the view picture, a hiding control for the prop display area;
correspondingly, after the prop display area of the virtual prop is presented in the view picture, the method further comprises:
receiving a region hiding instruction for the prop display area triggered based on the hiding control;
and in response to the region hiding instruction, hiding the presented prop display area of the virtual prop.
8. The method of claim 7, wherein the method further comprises:
presenting, in the view picture, a region expansion control corresponding to the prop display area;
correspondingly, after hiding the presented prop display area of the virtual prop, the method further comprises:
receiving a region expansion instruction for the prop display area triggered based on the region expansion control;
and in response to the region expansion instruction, presenting the hidden prop display area of the virtual prop.
9. The method of claim 1, wherein the method further comprises:
in response to a trigger operation for the prop display area, switching the presented view picture when the virtual object is inside the virtual prop to the picture of the virtual prop in the virtual scene.
10. The method of claim 9, wherein after switching to the picture of the virtual prop in the virtual scene, the method further comprises:
presenting a sub-picture independent of said picture, and
in the sub-picture, presenting a view picture corresponding to the exterior of the virtual prop when the virtual object is inside the virtual prop.
11. The method of claim 1, wherein the method further comprises:
presenting a display control corresponding to the prop display area;
in response to a trigger operation for the display control, controlling the state of the display control to be an on state;
correspondingly, the presenting, in the view picture, a prop display area of the virtual prop comprises:
when the display control is in the on state, presenting a prop display area of the virtual prop in the view picture.
12. The method of claim 1, wherein the method further comprises:
obtaining a prop attribute value corresponding to a prop attribute of the virtual prop; wherein the prop attribute comprises at least one of: a power attribute and a protection attribute of the virtual prop;
and when the prop attribute value is lower than an attribute value threshold, controlling the virtual object to exit from the interior of the virtual prop and canceling presentation of the prop display area of the virtual prop.
13. The method of claim 1, wherein the method further comprises:
presenting, in the view picture, an exit control corresponding to the virtual prop;
correspondingly, after presenting the prop display area of the virtual prop, the method further comprises:
in response to a trigger operation for the exit control, controlling the virtual object to exit from the interior of the virtual prop, and canceling presentation of the prop display area of the virtual prop.
14. The method of claim 1, wherein the controlling, in response to a trigger instruction for a virtual prop in a virtual scene, a virtual object to enter the interior of the virtual prop comprises:
presenting a usage control for entering the interior of the virtual prop;
and in response to a trigger instruction for the virtual prop in the virtual scene triggered based on the usage control, controlling the virtual object to enter the interior of the virtual prop.
15. The method of claim 1, wherein the method further comprises:
presenting, in the prop display area, attribute indication information corresponding to a prop attribute of the virtual prop;
wherein the prop attribute comprises at least one of: a power attribute and a protection attribute of the virtual prop; and the attribute indication information is used for indicating a usable state corresponding to the prop attribute.
16. A picture display apparatus for a virtual scene, the apparatus comprising:
a control module, configured to control, in response to a trigger instruction for a virtual prop in a virtual scene, a virtual object to enter the interior of the virtual prop;
a first presentation module, configured to control, in response to an operation instruction of the virtual object for the virtual prop, the virtual prop to be in an operating state, and present a view picture when the virtual object is inside the virtual prop in the operating state;
and a second presentation module, configured to present, in the view picture, a prop display area of the virtual prop, and present, in the prop display area, a picture of the virtual prop in the virtual scene.
17. An electronic device, characterized in that the electronic device comprises:
a memory for storing executable instructions;
a processor for implementing a method of displaying a picture of a virtual scene as claimed in any one of claims 1 to 15 when executing executable instructions stored in said memory.
18. A computer-readable storage medium storing executable instructions for implementing a method for displaying a picture of a virtual scene according to any one of claims 1 to 15 when executed.
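The attribute-driven display logic of claims 4 to 6 and the exit condition of claim 12 can be illustrated with a short sketch. This is an assumption-laden illustration, not the claimed implementation; the attribute names, intervals, and threshold below are invented for the example:

```python
def display_style(attr_name, attr_value,
                  blur_interval=(0, 30), transparency_interval=(0, 20)):
    """Map a prop attribute value to a display style for the prop
    display area (hypothetical intervals; cf. claims 4-6)."""
    if attr_name == "protection":
        lo, hi = blur_interval
        if lo <= attr_value <= hi:
            return "blurred"  # low protection: blurred display style
    elif attr_name == "power":
        lo, hi = transparency_interval
        if lo <= attr_value <= hi:
            return "preset-transparency"  # low power: transparent style
    return "normal"


def should_exit_prop(attr_value, threshold=10):
    """Cf. claim 12: below a threshold, the virtual object exits the prop
    and the prop display area is canceled (hypothetical threshold)."""
    return attr_value < threshold
```

In a game loop, `display_style` would be re-evaluated whenever the prop's attributes change, and `should_exit_prop` would trigger the forced-exit behavior of claim 12.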
CN202110301140.2A 2021-03-22 2021-03-22 Picture display method and device of virtual scene, electronic equipment and storage medium Active CN112870694B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110301140.2A CN112870694B (en) 2021-03-22 2021-03-22 Picture display method and device of virtual scene, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112870694A true CN112870694A (en) 2021-06-01
CN112870694B CN112870694B (en) 2023-07-21

Family

ID=76041593


Country Status (1)

Country Link
CN (1) CN112870694B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112090073A (en) * 2020-09-27 2020-12-18 网易(杭州)网络有限公司 Game display method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
爱吃瓜的小橘: "So this is the powered mecha? Love it (⑉°з°)-♡", Bilibili, URL: HTTPS://WWW.BILIBILI.COM/VIDEO/BV1OF4Y1K7XP/?SPM_ID_FROM=333.337.SEARCH-CARD.ALL.CLICK&VD_SOURCE=76D3264ACB028CC08FCCD0A145E89A77 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40045930

Country of ref document: HK

GR01 Patent grant