CN117085322A - Interactive observation method, device, equipment and medium based on virtual scene - Google Patents

Interactive observation method, device, equipment and medium based on virtual scene

Info

Publication number: CN117085322A
Application number: CN202311135567.5A
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 张冲 (Zhang Chong)
Applicant and current assignee: Tencent Technology (Shenzhen) Co., Ltd.
Legal status: Pending (the legal status is an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status, assignee, or priority date listed)

Classifications

    • A63F13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/837: Special adaptations for executing a specific game genre or game mode; shooting of targets
    • A63F2300/8076: Features of games using an electronically generated display having two or more dimensions, specially adapted for executing a specific type of game; shooting

    All of the above fall under A (Human necessities), A63 (Sports; games; amusements), A63F (Card, board, or roulette games; indoor games using small moving playing bodies; video games; games not otherwise provided for), and A63F13/00 / A63F2300/00 (Video games, i.e. games using an electronically generated display having two or more dimensions).

Abstract

The present application is a divisional application of Chinese application No. 202110702790.8. It discloses an interactive observation method, device, equipment, and medium based on a virtual scene, and relates to the field of virtual environments. The method includes: acquiring a first scene image, where the first scene image contains a first object; displaying a virtual environment picture, where the virtual environment picture contains a virtual scene and a matting object observed at a first observation angle, the matting object including the first object matted out of the first scene image; receiving a viewing angle adjustment operation; and adjusting the first observation angle to a second observation angle based on the viewing angle adjustment operation. When the matting object is displayed in the virtual scene, the angle at which the virtual scene is observed can be adjusted through the viewing angle adjustment operation, so the matting object increases the interactive diversity of the virtual scene, more ways of adjusting the observation angle are provided, and the efficiency of observing the virtual scene is improved.

Description

Interactive observation method, device, equipment and medium based on virtual scene
The present application is a divisional application of the Chinese application with application No. 202110702790.8, filed on June 24, 2021, which relates to an interactive observation method, device, equipment, and medium based on a virtual scene.
Technical Field
The embodiments of the present application relate to the field of virtual environments, and in particular to an interactive observation method, device, equipment, and medium based on a virtual scene.
Background
An application based on a virtual scene is a program that runs on a virtual environment constructed from three-dimensional models; while the application runs, the player interacts with the virtual environment by controlling a virtual object that moves within it.

In the related art, the player controls the virtual object in the virtual environment either by touching the display screen or by inputting control signals through an external input device, and the virtual object moves in the virtual environment according to the player's control.

However, in the interactive observation realized in the above manner, the virtual environment is usually observed by switching between a first-person view and a third-person view, and the view direction follows the line of sight of the virtual object, so the observation mode is single and limited.
Disclosure of Invention
The embodiments of the present application provide an interactive observation method, device, equipment, and medium based on a virtual scene, which can improve the efficiency of observing the virtual environment. The technical solution is as follows:
In one aspect, an interactive observation method based on a virtual scene is provided, and the method comprises the following steps:
acquiring a first scene image in response to a virtual scene display operation, wherein the first scene image comprises a first object;
displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object which are observed at a first observation angle, and the matting object comprises a first object obtained by matting the first scene image;
receiving a viewing angle adjustment operation;
based on the viewing angle adjustment operation, a first observation angle at which the virtual scene and the matting object are observed is adjusted to a second observation angle, where the first observation angle and the second observation angle are angles at which the virtual scene is observed from different observation positions or different observation distances.
In another aspect, an interactive viewing device based on a virtual scene is provided, the device comprising:
the acquisition module is used for responding to the virtual scene display operation to acquire a first scene image, wherein the first scene image comprises a first object;
the display module is used for displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object which are observed at a first observation angle, and the matting object comprises a first object obtained by matting the first scene image;
the receiving module is used for receiving the viewing angle adjustment operation;
the display module is further configured to adjust a first observation angle at which the virtual scene and the matting object are observed to a second observation angle based on the viewing angle adjustment operation, where the first observation angle and the second observation angle are angles at which the virtual scene is observed from different observation positions or different observation distances.
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, where the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by the processor to implement a virtual scene-based interactive observation method according to any one of the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by a processor to implement the virtual scene-based interactive observation method according to any one of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the virtual scene-based interactive observation method according to any one of the above embodiments.
The technical solutions provided in the embodiments of the present application bring at least the following beneficial effects:
first, the matting object corresponding to the player is displayed in combination with the virtual scene; while the matting object is displayed in the virtual scene, the observation angle for observing the virtual scene is adjusted through the viewing angle adjustment operation, and both the observation position and the observation distance can be adjusted. The matting object thereby improves the interactive diversity of the virtual scene, the ways of adjusting the angle at which the virtual scene is observed are increased, and the efficiency of observing the virtual scene is improved.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the following briefly introduces the drawings required for describing the embodiments. The drawings described below are only some embodiments of the present application; a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a matting object generation process provided by an exemplary embodiment of the present application;
FIG. 2 is a block diagram of an electronic device provided in an exemplary embodiment of the application;
FIG. 3 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 4 is an interface diagram of an interactive viewing method based on virtual scenes according to an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a viewing angle adjustment provided based on the embodiment shown in FIG. 4;
FIG. 6 is an overall schematic of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 7 is an overall schematic of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 8 is a flow chart of a virtual scene-based interactive viewing method provided in another exemplary embodiment of the present application;
FIG. 9 is a schematic diagram of a viewing angle variation process provided based on the embodiment shown in FIG. 8;
FIG. 10 is a flow chart of a virtual scene-based interactive viewing method provided in another exemplary embodiment of the present application;
FIG. 11 is a schematic view of a directional adjustment provided based on the embodiment shown in FIG. 10;
FIG. 12 is an overall flow chart of an interactive viewing method provided by an exemplary embodiment of the present application;
FIG. 13 is a block diagram of an interactive viewing device based on virtual scenes according to an exemplary embodiment of the present application;
FIG. 14 is a block diagram of an interactive viewing device based on virtual scenes according to another exemplary embodiment of the present application;
fig. 15 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
First, the terms involved in the embodiments of the present application will be briefly described:
Virtual environment: the virtual environment that an application displays (or provides) while running on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. It may be any one of a two-dimensional, a 2.5-dimensional, and a three-dimensional virtual environment, which is not limited in the present application. The following embodiments are illustrated with a three-dimensional virtual environment. In the embodiments of the present application, the virtual environment is also called a virtual scene.
Matting objects: the method is to acquire a scene image through a live-action camera, and then to extract a designated object from the scene image. Illustratively, in the embodiment of the present application, a description will be given by taking a case where a person image is scratched from a scene image to obtain a scratched object. Referring to fig. 1 schematically, a schematic diagram of a process of generating a matting object according to an exemplary embodiment of the present application is shown, as shown in fig. 1, a scene is acquired by using a live-action camera 100 to acquire a scene image 110, where a person 120 is included in an image acquisition range of the live-action camera 100, so that the scene image 110 includes a corresponding object 121, and the object 121 is matting out of the scene image 110 to obtain a matting object 122.
In the embodiments of the present application, the virtual scene and the matting object located in it are displayed in the virtual environment picture, creating the experience of the player interacting within the virtual scene.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, or the like. An application supporting a virtual environment, such as an application supporting a three-dimensional virtual environment, is installed and runs on the terminal. The application may be any one of a virtual reality application, a three-dimensional map application, a third-person shooting game (Third-Person Shooting game, TPS), a first-person shooting game (First-Person Shooting game, FPS), and a multiplayer online battle arena game (Multiplayer Online Battle Arena Games, MOBA). Alternatively, the application may be a stand-alone application, such as a stand-alone three-dimensional game, or a network-connected application.
Fig. 2 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 200 includes: an operating system 220 and application programs 222.
Operating system 220 is the underlying software that provides applications 222 with secure access to computer hardware.
The application 222 is an application supporting a virtual environment. Alternatively, application 222 is an application that supports a three-dimensional virtual environment. The application 222 may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, a MOBA game, and a multiplayer warfare survival game. The application 222 may be a stand-alone application, such as a stand-alone three-dimensional game, or a network-connected application.
FIG. 3 illustrates a block diagram of a computer system provided in accordance with an exemplary embodiment of the present application. The computer system 300 includes: a first device 320, a server 340, and a second device 360.
The first device 320 installs and runs an application supporting a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, a MOBA game, and a multiplayer warfare survival game. The first device 320 is the device used by the first user, who uses it to control the first matting object moving in the virtual environment. The first device 320 is configured with a first camera; after the first camera captures images of the first user (or other users within its image acquisition range) and the images are matted, the first matting object is displayed in the virtual environment.
The first device 320 is connected to the server 340 via a wireless network or a wired network.
Server 340 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 340 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 340 takes on primary computing work, and the first device 320 and the second device 360 take on secondary computing work; alternatively, the server 340 performs the secondary computing job and the first device 320 and the second device 360 perform the primary computing job; alternatively, the server 340, the first device 320, and the second device 360 may perform collaborative computing using a distributed computing architecture.
The second device 360 installs and runs an application supporting a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second device 360 is the device used by the second user, who uses it to control the second matting object moving in the virtual environment. The second device 360 is configured with a second camera; after the second camera captures images of the second user (or other users within its image acquisition range) and the images are matted, the second matting object is displayed in the virtual environment.
Optionally, the first matting object and the second matting object are in the same virtual environment. Optionally, the first matting object and the second matting object may belong to the same team, the same organization, have a friend relationship, or have temporary communication authority. Alternatively, the first and second matting objects may also belong to different teams, different organizations, or two parties with hostility.
Alternatively, the applications installed on the first device 320 and the second device 360 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 320 may refer broadly to one of a plurality of devices and the second device 360 may refer broadly to one of a plurality of devices, the present embodiment being illustrated with only the first device 320 and the second device 360. The device types of the first device 320 and the second device 360 are the same or different, and include: at least one of a game console, a desktop computer, a smart phone, a tablet computer, an electronic book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated with the device being a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or smaller. For example, there may be only one such device, or several tens or hundreds of devices, or more. The number and types of devices are not limited in the embodiments of the present application.
It should be noted that the server 340 may be implemented as a physical server or as a cloud server. Cloud technology refers to a hosting technology that unifies hardware, software, network, and other resources within a wide area network or a local area network to realize the computation, storage, processing, and sharing of data. Cloud technology is the general term for the network, information, integration, management-platform, application, and other technologies applied on the basis of the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing is becoming an important support: background services of technical network systems, such as video websites, image websites, and portal websites, require large amounts of computing and storage resources. With the continued development of the internet industry, each item may in the future carry its own identification mark, which must be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data need strong system backing, which can only be realized through cloud computing.
In some embodiments, the method provided by the embodiment of the application can be applied to a cloud game scene, so that the calculation of data logic in the game process is completed through a cloud server, and the terminal is responsible for displaying a game interface.
In some embodiments, the server 340 described above may also be implemented as a node in a blockchain system. Blockchain is a new application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks generated and linked using cryptographic methods, where each data block contains a batch of network transactions and is used to verify the validity (anti-counterfeiting) of its information and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product service layer, and an application service layer.
The application scenarios of the embodiments of the present application include at least one of the following:

First, the method is applied to a game scenario. The game may be implemented as a cloud game, in which a cloud server completes the computational logic during the game and the terminal completes the display logic.

Illustratively, the game may be implemented as at least one of a dance game, a shooting game, and an educational game. Player A captures a scene image through a first terminal configured with a first camera, and the matting object a corresponding to player A is matted out of that scene image; player B captures a scene image through a second terminal configured with a second camera, and the matting object b corresponding to player B is matted out of that scene image. The matting object a, the matting object b, and a preset virtual scene are displayed in the terminal interface, realizing the process in which player A and player B interact in the virtual scene and participate in the game.
Second, the method is applied to a live-streaming scenario. A live-streaming application involves an anchor and an audience: the anchor is a user who creates a live room, and the audience are users who watch the live room. Audience members can interact with one another in a virtual scene within the live room, or the anchor can interact with the audience in the virtual scene.

Schematically, anchor 1 creates a virtual-scene interaction activity in the live room and invites the audience to participate, and audience member 2 is invited to join the activity together with anchor 1. Anchor 1 captures a scene image through a first terminal configured with a first camera, and the matting object m corresponding to anchor 1 is matted out of that scene image; audience member 2 captures a scene image through a second terminal configured with a second camera, and the matting object n corresponding to audience member 2 is matted out of that scene image. The matting object m, the matting object n, and a preset virtual scene are displayed in the terminal interface, realizing interaction between anchor 1 and audience member 2 in the virtual scene. Audience members other than audience member 2 can watch the interaction between anchor 1 and audience member 2 in the virtual scene.
Referring to FIG. 4, which shows a flowchart of an interactive observation method based on a virtual scene according to an exemplary embodiment of the present application. The method is illustrated as applied to a first terminal configured with a camera, and includes:
step 401, acquiring a first scene image by a camera in response to a virtual scene display operation.
In some embodiments, the virtual scene display operation refers to an operation in which a user instructs to turn on a virtual scene.
Illustratively, the implementation manner of the virtual scene display operation includes at least one of the following manners for different application scenes:
First, for the cloud gaming scenario, the user triggers a game start operation as the virtual scene display operation; that is, the cloud game is started and the game interface is entered according to the game start operation. The user may team up with friends before entering the cloud game, or invite friends to a team after entering.

Second, for the live-streaming scenario, the anchor account triggers an interactive-space opening operation as the virtual scene display operation; that is, the virtual scene in which the anchor account and audience accounts interact is opened according to the interactive-space opening operation.
The first scene image contains the first object located within the shooting range of the camera of the first terminal. That is, the first scene image is an image acquired by the camera configured on the first terminal; the first terminal continuously acquires first scene images as a video stream, and during acquisition the first object is within the shooting range of the camera and is therefore displayed in the first scene image.
In some embodiments, when the terminal receives the virtual scene display operation, the camera is turned on and the first scene image is acquired through the camera.
After the first scene image is acquired, it must be matted to extract the first object. The matting of the first scene image may be completed by the terminal or by the server. When the terminal completes the matting, the first scene image is matted directly after being acquired by the camera, which saves data interaction between the terminal and the server; when the server completes the matting, the first scene image is sent to the server after being acquired by the camera, and the server mats the first scene image to obtain the first object.
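A sketch of this routing decision under an assumed client/server split; the class and method names are illustrative placeholders, not the patent's API:

```python
class ServerStub:
    """Stand-in for the real server; method names are assumptions."""
    def receive_matting_object(self, cutout):
        print("server received pre-matted object")
    def receive_scene_image(self, frame):
        print("server received raw frame; matting runs server-side")

class MattingClient:
    def __init__(self, matting_on_terminal: bool, server):
        self.matting_on_terminal = matting_on_terminal
        self.server = server

    def run_matting(self, frame):
        return frame  # stand-in for a real segmentation step

    def submit_frame(self, frame):
        if self.matting_on_terminal:
            # terminal-side matting: only the cutout is uploaded,
            # saving data interaction between terminal and server
            self.server.receive_matting_object(self.run_matting(frame))
        else:
            # server-side matting: the raw scene image is uploaded
            self.server.receive_scene_image(frame)

MattingClient(True, ServerStub()).submit_frame(b"frame-bytes")
```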
In some embodiments, the camera configured by the first terminal is a 2-dimensional camera, i.e. for capturing planar images. Or the camera configured by the first terminal is a 3-dimensional camera, that is, the scene depth information is acquired in the process of acquiring the image.
When the camera configured on the first terminal is a 2-dimensional camera, the data interaction between the terminal and the server is based on planar images, which reduces the amount of data exchanged; when the camera is a 3-dimensional camera, the terminal can construct a three-dimensional model of the player from the acquired depth information, which improves the realism of the matting object displayed in the virtual scene.
Step 402, displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object which are observed at a first observation angle.
The matting object comprises a first object obtained by matting a first scene image and a second object obtained by matting a second scene image, wherein the second scene image is acquired by a second terminal provided with a camera.
It should be noted that, in this embodiment, the first object and the second object are described as examples included in the virtual scene, and in some alternative embodiments, only one object may be included in the virtual scene, or three or more objects may be included in the virtual scene, which is not limited in this embodiment.
In this embodiment, a server is taken as an example to perform matting on a scene image, and then a terminal sends a first scene image to the server and receives picture display data fed back by the server, where the picture display data includes scene data corresponding to a virtual scene and object data corresponding to a matting object.
The scene data of the virtual scene is determined from a preset virtual scene; or the scene data corresponds to the result of the user's selection of a virtual scene, which provides a scheme for the user to choose the virtual scene and increases the diversity of virtual scene interaction; or the scene data corresponds to a randomly obtained virtual scene. This embodiment is not limited in this respect.
The terminal displays a virtual environment picture based on the scene data and the object data fed back by the server.
In some embodiments, the display positions of the first object and the second object in the virtual scene are determined randomly among preset candidate positions, which increases the uncertainty of object display; or the display positions of the first object and the second object in the virtual scene are indicated to the terminal by the server according to a preset display rule. Illustratively, when the object data includes an object display position, the virtual scene is displayed based on the scene data, and the matting object is displayed at the object display position in the virtual scene.
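A small sketch of this position choice, with assumed candidate coordinates and an illustrative function name:

```python
import random

# illustrative candidate positions in the virtual scene (assumed values)
CANDIDATE_POSITIONS = [(2.0, 0.0, 1.0), (-2.0, 0.0, 1.0), (0.0, 0.0, 3.0)]

def choose_display_position(server_position=None):
    """A server-indicated position (preset display rule) takes priority;
    otherwise a position is picked at random among the preset candidates,
    adding uncertainty to where the object appears."""
    if server_position is not None:
        return server_position
    return random.choice(CANDIDATE_POSITIONS)

print(choose_display_position())                 # random candidate
print(choose_display_position((0.0, 0.0, 5.0)))  # server-specified position
```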
The current virtual environment picture is a picture obtained by observing the virtual scene at a first observation angle, wherein the first observation angle is an observation angle of a camera model for observing the virtual scene.
Optionally, each player corresponds to a camera model for observing the virtual scene in the virtual scene; or, each player is provided with n camera models for switching use, wherein the n camera models are used for observing the virtual scene in the virtual scene; or, the player meeting the observation switching condition corresponds to n camera models for observing the virtual scene in the virtual scene for switching, and the player not meeting the observation switching condition corresponds to one camera model for observing the virtual scene in the virtual environment, wherein n is an integer greater than 1.
In some embodiments, registration is required before the virtual environment picture is displayed, so that the object can be distinguished from the image background and the matting accuracy is improved. Optionally, a calibration screen is displayed. The calibration screen contains the first scene image, and the first scene image contains an indication frame and an indication line: the indication frame frames the first object, and the indication line lies at a designated position of the first scene image, dividing it into a first area and a second area. The background portion of the first scene image is determined by acquiring images of the first object moving from a first position to a second position, where the first position is a position of the indication frame in the first area and the second position is a position of the indication frame in the second area. That is, the indication frame is moved from the first area of the first scene image to the second area: while the indication frame is displayed in the first area, the content displayed in the second area is pure background; while the indication frame is displayed in the second area, the content displayed in the first area is pure background. Combining the two clean portions yields the complete background of the first scene image apart from the first object, so that when the first object is subsequently matted out of the first scene image, the matting can be performed against the identified background image, which improves the accuracy of the matting result.
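A minimal sketch of the background reconstruction, assuming a vertical indication line at column `split_col` of the frame; the function and parameter names are illustrative:

```python
import numpy as np

def reconstruct_background(phase1_frame: np.ndarray,
                           phase2_frame: np.ndarray,
                           split_col: int) -> np.ndarray:
    """Stitch the two calibration phases into one complete background.

    phase1_frame: captured while the subject stands in the first area
                  (left of the indication line), so the second area is
                  pure background;
    phase2_frame: captured after the subject moved to the second area,
                  so the first area is pure background.
    """
    background = np.empty_like(phase1_frame)
    background[:, split_col:] = phase1_frame[:, split_col:]  # clean second area
    background[:, :split_col] = phase2_frame[:, :split_col]  # clean first area
    return background

# usage with stand-in frames divided at column 320
bg = reconstruct_background(np.zeros((480, 640, 3), np.uint8),
                            np.ones((480, 640, 3), np.uint8),
                            split_col=320)
```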
In some embodiments, the virtual environment picture is displayed in response to the registration of the first object and the second object being completed: the server mats the first object out of the first scene image according to the first background area obtained from the registration of the first object, and mats the second object out of the second scene image according to the second background area obtained from the registration of the second object, so that the first object and the second object are displayed in the virtual scene.
Step 403, receive a viewing angle adjustment operation.
In some embodiments, a player-triggered viewing angle adjustment operation is received in response to the player meeting an observation switch condition. Implementations of the viewing angle adjustment operation include at least one of:
First, through an external input device, such as a keyboard or mouse: a key-in operation is received, and the key-in operation corresponding to the viewing-angle adjustment function is determined as the viewing angle adjustment operation.

The viewing angle adjustment operation can be executed accurately and rapidly through the external input device, which improves the execution accuracy of the operation.
Second, an angle adjustment control is displayed on the virtual environment picture, and the viewing angle adjustment operation is an operation received on the angle adjustment control.
The viewing angle adjustment operation can be executed accurately and rapidly through the angle adjustment control, which improves the execution accuracy of the operation.
Third, the viewing angle is controlled by voice input, in which case the received voice input operation serves as the viewing angle adjustment operation.

The virtual environment in the embodiments of the present application is displayed in combination with the player, that is, the player's actions are recorded during the game, so there are situations where operating an external input device or the smart device itself is inconvenient. Controlling the viewing angle by voice input spares the player any additional operation of an external input device or the smart device, which improves the efficiency of viewing-angle control.
Fourth, an angle adjustment control is displayed in the virtual environment picture, and the player strikes a corresponding pose in the real scene. When the server mats the player out and displays the matting object in the virtual environment picture, the action performed by the player brings the matting object into contact with the angle adjustment control, and the contact operation between the matting object and the angle adjustment control is taken as the corresponding angle adjustment operation. Schematically, referring to FIG. 5: after the player poses in the real scene, the server mats the scene image to obtain the matting object 510 corresponding to the player; when the matting object 510 is mapped into the virtual environment picture 500, the foot 511 of the matting object coincides with the angle adjustment control 520 in the virtual environment picture 500, which is taken as the contact operation with the angle adjustment control 520 and thereby determines the angle adjustment operation.

The virtual environment in the embodiments of the present application is displayed in combination with the player, that is, the player's actions are recorded during the game; realizing angle adjustment through contact between the player's action and the angle adjustment control increases the diversity of interaction between the player and the virtual scene and enhances the interactive capability of the virtual scene.
It should be noted that the implementation of the above-mentioned viewing angle adjustment operation is merely an illustrative example, and the embodiments of the present application are not limited thereto.
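For the fourth approach, a minimal sketch of the contact check, treating the mapped body part of the matting object and the control as screen-space rectangles (all coordinates and names are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other: "Rect") -> bool:
        return (self.x < other.x + other.w and other.x < self.x + self.w
                and self.y < other.y + other.h and other.y < self.y + self.h)

def contact_triggers_adjustment(body_part: Rect, control: Rect) -> bool:
    """The matting object touching the on-screen angle adjustment control
    counts as the angle adjustment operation (the fourth approach above)."""
    return body_part.overlaps(control)

# e.g. the cutout's foot region mapped into screen space vs. the control
print(contact_triggers_adjustment(Rect(300, 420, 40, 30), Rect(310, 430, 60, 60)))  # True
```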
Step 404, based on the viewing angle adjustment operation, adjusting a first viewing angle for viewing the virtual scene and the matting object to a second viewing angle.
Optionally, the first viewing angle and the second viewing angle are angles at which the virtual scene is viewed from different viewing positions or different viewing distances.
The way the observation angle is adjusted either corresponds to the viewing angle adjustment operation, or is preset.

Illustratively, when the adjustment of the observation angle corresponds to the viewing angle adjustment operation: a left adjustment control and a right adjustment control are displayed in the virtual environment picture, and when an angle adjustment operation on the left adjustment control is received, the first observation angle is rotated to the left to obtain the second observation angle.

When the way the observation angle is adjusted is preset, for example: a viewing angle adjustment control is displayed in the virtual environment picture, and when a viewing angle adjustment operation on that control is received, the viewing angle is adjusted toward the preset direction to obtain the second observation angle.
The adjustment from the first observation angle to the second observation angle may be a rotation in the horizontal direction of the virtual scene; or a distance adjustment along the observation direction; or an arbitrary adjustment along the spherical surface, which is not limited in the embodiments of the present application.

Schematically, as shown in FIG. 6, which illustrates a viewing angle adjustment manner provided by an exemplary embodiment of the present application: the camera model 601 observes the virtual scene from the spherical observation model 600, facing the sphere center. When the camera model rotates along the observation track 610, the adjustment from the first observation angle to the second observation angle is a rotation in the horizontal direction of the virtual scene; when the camera model 601 changes its distance along the line to the sphere center, it is a distance adjustment along the observation direction; when the camera model 601 rotates arbitrarily on the sphere, it is an arbitrary adjustment along the spherical surface.
Rotating in the horizontal direction of the virtual scene realizes a lateral, surround-style observation of the virtual scene, that is, the scene is observed through a full 360 degrees laterally, so that every detail of the virtual scene is displayed and the observation efficiency is improved.

Adjusting the distance along the observation direction adjusts the distance between the observation point and the matting object. Observing the matting object from a closer distance yields more observation detail and produces a close-range observation effect; observing it from a farther distance yields more comprehensive information about the matting object combined with the virtual scene and produces a long-range observation effect.

Arbitrary adjustment along the spherical surface realizes an omnidirectional adjustment of the observation angle, displaying the combination of the matting object and the virtual scene from different angles and increasing the diversity of observation modes.
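A sketch of the spherical observation model under these three adjustment modes, assuming the camera position is parameterized by yaw, pitch, and radius about the observation center (the parameterization and all values are illustrative):

```python
import math

def camera_position(center, yaw, pitch, radius):
    """Position of the camera model on the spherical observation model,
    always facing the observation center `center`.

    yaw    - horizontal rotation along the observation track (radians)
    pitch  - elevation, for arbitrary adjustment on the sphere (radians)
    radius - observation distance along the line to the sphere center
    """
    cx, cy, cz = center
    return (cx + radius * math.cos(pitch) * math.cos(yaw),
            cy + radius * math.sin(pitch),
            cz + radius * math.cos(pitch) * math.sin(yaw))

center = (0.0, 1.0, 0.0)
first_angle  = camera_position(center, yaw=0.0,         pitch=0.2, radius=5.0)
second_angle = camera_position(center, yaw=math.pi / 6, pitch=0.2, radius=5.0)  # horizontal rotation
zoomed_in    = camera_position(center, yaw=0.0,         pitch=0.2, radius=3.0)  # distance adjustment
```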
In summary, in the interactive observation method based on a virtual scene provided by the embodiments of the present application, the matting object corresponding to the player is displayed in combination with the virtual scene, and while the matting object is displayed in the virtual scene, the observation angle for observing the virtual scene is adjusted through the viewing angle adjustment operation. The matting object thereby improves the interactive diversity of the virtual scene, the ways of adjusting the angle at which the virtual scene is observed are increased, and the efficiency of observing the virtual scene is improved.
Schematically, FIG. 7 is an overall diagram of an implementation environment provided by an exemplary embodiment of the present application. As shown in FIG. 7, player A operates a smart device 710 to participate in the game, and the smart device 710 is configured with a camera to capture images of player A; player B operates a smart device 720 to participate in the game, and the smart device 720 is configured with a camera to capture images of player B. The server 740 receives the captured scene images sent by the smart devices 710 and 720, mats the scene images to obtain the matting objects corresponding to player A and player B, and feeds the virtual scene and the matting objects back to the viewing terminal 750 for display; the smart devices 710 and 720 are themselves also viewing terminals.
In some embodiments, since the matting object is implemented as a two-dimensional image, it is also necessary, while the observation angle is adjusted, to control the display direction of the matting object so that the complete image content remains displayed. FIG. 8 is a flowchart of an interactive observation method based on a virtual scene according to another exemplary embodiment of the present application. The method is illustrated as applied to a first terminal configured with a camera, and, as shown in FIG. 8, includes:
Step 801, a first scene image is acquired by a camera in response to a virtual scene display operation.
In some embodiments, the virtual scene display operation refers to an operation in which a user instructs to turn on a virtual scene.
The first scene image comprises a first object located in a shooting range of a camera of the first terminal. That is, the first scene image refers to an image acquired by a camera configured by the first terminal.
In some embodiments, when the terminal receives the virtual scene display operation, the camera is turned on and the first scene image is acquired through the camera.
Step 802, displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object which are observed at a first observation angle.
The matting object comprises a first object obtained by matting a first scene image and a second object obtained by matting a second scene image, wherein the second scene image is acquired by a second terminal provided with a camera.
The current virtual environment picture is a picture obtained by observing the virtual scene at a first observation angle, wherein the first observation angle is an observation angle of a camera model for observing the virtual scene.
Step 803, receive a viewing angle adjustment operation.
In some embodiments, a player-triggered viewing angle adjustment operation is received in response to the player meeting an observation switch condition.
Step 804, based on the viewing angle adjustment operation, the first viewing angle for viewing the virtual scene is adjusted to the second viewing angle.
In some embodiments, the terminal corresponds to a camera model in the virtual scene, which is used to adjust the viewing angle. The camera model is adjusted from a first viewing position for viewing the virtual scene at a first viewing angle to a second viewing position for viewing the virtual scene at a second viewing angle based on the viewing angle adjustment operation. In some embodiments, the first viewing position and the second viewing position are two viewing positions on the same sphere, the first viewing angle being the angle of view from the first viewing position toward the center of the sphere, the second viewing angle being the angle of view from the second viewing position toward the center of the sphere.
In some embodiments, the angular difference (or distance difference) between the first observation angle and the second observation angle is preset; or the angular difference (or distance difference) is determined from the viewing angle adjustment operation, so that the player can select the desired observation angle under a stable, predictable relation, which improves observation accuracy; or the second observation angle is obtained by randomly adjusting some angular difference (or distance difference) from the first observation angle, so that the player observes the virtual scene from a randomly determined angle, which adds interest to observing the virtual scene. The embodiments of the present application do not limit the relation between the first observation angle and the second observation angle.
That is, each player corresponds to one camera model in the virtual scene, and the virtual environment picture is collected through the camera model, so that when the observation angle is adjusted, the adjustment of the observation angle can be realized only by rotating the observation position of the camera model.
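A small sketch of the three ways, described above, of deriving the second observation angle from the first; the parameter names and the preset value are illustrative assumptions:

```python
import math
import random

def second_observation_angle(first_yaw: float, mode: str,
                             operation_delta: float = 0.0,
                             preset_delta: float = math.pi / 12) -> float:
    """Three ways of deriving the second observation angle from the first."""
    if mode == "preset":
        return first_yaw + preset_delta          # fixed angular difference
    if mode == "operation":
        return first_yaw + operation_delta       # difference read from the operation
    return first_yaw + random.uniform(0.0, 2.0 * math.pi)  # random adjustment

print(second_observation_angle(0.0, "preset"))
print(second_observation_angle(0.0, "operation", operation_delta=0.4))
print(second_observation_angle(0.0, "random"))
```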
In other embodiments, at least two camera models correspond to the virtual scene, where the first viewing angle corresponds to a first camera model and the second viewing angle corresponds to a second camera model; based on the viewing angle adjustment operation, observation of the virtual scene is switched from the first camera model to the second camera model.
That is, each player corresponds to at least two camera models in the virtual scene; or, the current player corresponds to at least two camera models in the virtual scene, so that when the observation angle is adjusted, the adjustment of the observation angle can be realized only by switching between different camera models.
When the observation angle is adjusted through switching between at least two camera models, the adjustment of the observation angle can be realized by directly switching between a plurality of set camera models, the rotation and movement angles of the camera models are not required to be calculated, and the calculation amount of a server is reduced.
When the single camera model is rotated and moved to adjust the observation angle, the rotation angle of the camera model can be accurately controlled, and the accuracy of switching the observation angle is improved.
Step 805, keeping the position of the matting object in the virtual scene unchanged, so as to display the matting object in a direction facing the second observation angle.
Because the scene image is a two-dimensional image, the matting object matted out of it is also a two-dimensional image; if the display direction of the matting object never changed, the matting object would easily appear as a "paper sheet" or an incomplete portrait when the virtual scene is observed from other viewing angles. Therefore, in the embodiments of the present application, the matting object is kept facing the direction of the observation angle at all times.
In some embodiments, the virtual scene is observed through the camera model, so that the position of the matting object in the virtual scene is kept unchanged, and the matting object is controlled to be displayed towards the camera model corresponding to the second observation view angle.
Referring schematically to FIG. 9, which shows the observation-angle change process according to an exemplary embodiment of the present application: as shown in FIG. 9, the virtual environment picture 910 is obtained by observing the virtual scene through the camera model at the first observation angle; it contains the matting object 901 and the matting object 902, both displayed facing the first observation angle. When the viewing angle adjustment operation is received, the virtual environment picture 920 obtained by observing the virtual scene through the camera model at the second observation angle is displayed; it contains the matting object 901 and the matting object 902, both displayed facing the second observation angle.

That is, whether the virtual scene is observed from the first observation angle or the second observation angle, the matting object remains displayed intact, facing the viewing angle. In some embodiments, after the observation angle is determined, an image of the virtual scene at that angle is acquired, and the matting object is attached to that image to form the picture of the matting object within the virtual scene.
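A minimal sketch of this facing behavior as a yaw-only billboard, a common way to keep a flat cutout oriented toward the camera (the patent does not name a specific technique; the function and values are illustrative):

```python
import math

def billboard_yaw(object_pos, camera_pos):
    """Yaw (about the vertical axis) that keeps the two-dimensional matting
    object facing the camera model; the object's position in the virtual
    scene stays unchanged, only its facing direction rotates."""
    dx = camera_pos[0] - object_pos[0]
    dz = camera_pos[2] - object_pos[2]
    return math.atan2(dx, dz)

# re-orient each matting object after every observation-angle change
yaw = billboard_yaw((0.0, 0.0, 0.0), (3.0, 1.5, 4.0))
print(round(math.degrees(yaw), 1))  # 36.9 degrees toward the camera
```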
In summary, in the interactive observation method based on a virtual scene provided by the embodiments of the present application, the matting object corresponding to the player is displayed in combination with the virtual scene, and while the matting object is displayed in the virtual scene, the observation angle for observing the virtual scene is adjusted through the viewing angle adjustment operation. The matting object thereby improves the interactive diversity of the virtual scene, the ways of adjusting the angle at which the virtual scene is observed are increased, and the efficiency of observing the virtual scene is improved.

In the method provided by this embodiment, when the virtual scene is observed with a switched viewing angle, the display position of the matting object is kept unchanged while its display direction is adjusted accordingly, ensuring that the matting object is displayed facing the observation angle and avoiding the observation distortion that would otherwise arise from the matting object being a two-dimensional image.
In some embodiments, the viewing angle adjustment operation includes at least one of a direction adjustment and a distance adjustment. FIG. 10 is a flowchart of an interactive observation method based on a virtual scene according to another exemplary embodiment of the present application. The method is illustrated as applied to a first terminal configured with a camera, and, as shown in FIG. 10, includes:
Step 1001, a first scene image is acquired by a camera in response to a virtual scene display operation.
In some embodiments, the virtual scene display operation refers to an operation in which a user instructs to turn on a virtual scene.
The first scene image comprises a first object located in a shooting range of a camera of the first terminal. That is, the first scene image refers to an image acquired by a camera configured by the first terminal.
In some embodiments, when the terminal receives the virtual scene display operation, the camera is turned on and the first scene image is acquired through the camera.
Step 1002, displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object which are observed at a first observation angle.
The matting object comprises a first object obtained by matting a first scene image and a second object obtained by matting a second scene image, wherein the second scene image is acquired by a second terminal provided with a camera.
The current virtual environment picture is a picture obtained by observing the virtual scene at a first observation angle, wherein the first observation angle is an observation angle of a camera model for observing the virtual scene.
Step 1003, a viewing angle adjustment operation is received.
Optionally, an angle adjustment control is displayed on the virtual environment screen, and the viewing angle adjustment operation is an operation received on the angle adjustment control.
Considering that images of the player are captured and displayed in the virtual scene, the embodiments of the present application are described by taking as an example triggering the viewing angle adjustment through contact between the matting object and the angle adjustment control on the virtual environment picture.
In some embodiments, the angle adjustment control includes a direction adjustment region; the direction adjustment operation on the direction adjustment area is received when the adjustment of the viewing direction is required. Illustratively, the direction adjustment area comprises a left adjustment control, a right adjustment control, an upper adjustment control and a lower adjustment control; illustratively, taking a left-turning control as an example, when a trigger operation on the left-turning control is received, rotating an observation position corresponding to a first observation angle to the left by a preset angle or a random angle, that is, rotating the observation position corresponding to the first observation angle to the left by a certain angle to a new observation position on an observation sphere; or, the camera model used for observing the virtual scene is switched from the camera model at the first observation position corresponding to the first observation angle to the camera model at the second observation position after rotating leftwards by a certain angle.
In some embodiments, the angle adjustment control includes a distance adjustment region. When the observation distance needs to be adjusted, a distance adjustment operation on the distance adjustment region is received, and the distance at which the virtual scene is observed is adjusted based on that operation; the distance adjustment operation is used to adjust the position of the observation point along the observation direction. Optionally, the distance adjustment region can adjust the observation distance only within a preset distance range; that is, the distance adjustment region has an upper observation distance limit and a lower observation distance limit. This avoids over-adjusting the observation distance, which would otherwise distort the matting object, a two-dimensional image, when the observation distance becomes too short.
Optionally, the upper and lower observation distance limits refer to the maximum and minimum distances between the observation position and the observation center. That is, when the distance between the observation position and the observation center reaches the minimum distance (the lower observation distance limit), the observation position cannot be moved closer to the observation center; when the distance reaches the maximum distance (the upper observation distance limit), the observation position cannot be moved farther from the observation center. Setting the upper and lower observation distance limits bounds the level of detail at which the virtual scene and the matting object are observed, which avoids both the heavy computation consumed by detail rendering when the observation distance is too short and the unclear interface content and low observation efficiency when the observation distance is too long.
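A minimal sketch of distance adjustment with clamping; the limit values min_dist and max_dist are illustrative assumptions, since the embodiment does not specify concrete upper or lower observation distance limits:

    import math

    def adjust_observation_distance(position, center, delta,
                                    min_dist=2.0, max_dist=20.0):
        """Move the observation position along the observation direction while
        clamping the resulting distance to [min_dist, max_dist]."""
        offset = [p - c for p, c in zip(position, center)]
        dist = math.sqrt(sum(d * d for d in offset))
        # Clamp the requested distance to the permitted range.
        new_dist = max(min_dist, min(max_dist, dist + delta))
        scale = new_dist / dist
        return tuple(c + d * scale for c, d in zip(center, offset))

    # Zooming in by 100 units still stops at the lower observation distance limit:
    clamped = adjust_observation_distance((0.0, 2.0, 5.0), (0.0, 0.0, 0.0), -100.0)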
Step 1004, determining the adjustment mode corresponding to the viewing angle adjustment operation on the angle adjustment control.
In some embodiments, it is determined whether the viewing angle adjustment operation is an operation on the direction adjustment region or an operation on the distance adjustment region. When the viewing angle adjustment operation is an operation on the direction adjustment region, the angle at which the virtual scene is observed is rotated; when the viewing angle adjustment operation is an operation on the distance adjustment region, the view of the virtual scene is zoomed in or out; that is, when the view is zoomed in, the virtual scene and the objects in it are displayed enlarged, and when the view is zoomed out, they are displayed reduced.
For illustration, please refer to fig. 11: an angle adjustment control 1110 is displayed on the virtual environment picture 1100, where the angle adjustment control 1110 includes a direction adjustment region 1111 and a distance adjustment region 1112. When a direction adjustment operation on the direction adjustment region 1111 is received, the angle at which the virtual scene is observed is adjusted by rotating the camera model; when a distance adjustment operation on the distance adjustment region 1112 is received, the view of the virtual scene is adjusted by zooming the camera model in or out.
In some embodiments, the display of the direction adjustment region 1111 and that of the distance adjustment region 1112 are linked. Illustratively, upon receiving a direction adjustment operation on the direction adjustment region 1111, the observation direction is adjusted according to the direction adjustment operation and the distance adjustment region 1112 is displayed in association, so that the observation distance can then be adjusted by a distance adjustment operation received on the distance adjustment region 1112. Alternatively, when a distance adjustment operation is received on the distance adjustment region 1112 and the observation distance is adjusted accordingly, the direction adjustment region 1111 is displayed in association, and when a direction adjustment operation on the direction adjustment region 1111 is received, the observation direction is adjusted according to the direction adjustment operation.
Alternatively, adjustment options may be displayed on the angle adjustment control 1110, including a direction option and a distance option. When a selection operation on the direction option is received, the direction adjustment region 1111 is displayed, and the display can return to the adjustment options through a confirmation control; when a selection operation on the distance option is received, the distance adjustment region 1112 is displayed, and the display can likewise return to the adjustment options through the confirmation control.
In some embodiments, the viewing angle adjustment operation includes both an operation on the direction adjustment region and an operation on the distance adjustment region; the adjustment mode then includes adjustment in both direction and distance, and the observation direction and the observation distance both need to be adjusted correspondingly.
In step 1005, the first observation angle for observing the virtual scene is adjusted to the second observation angle in the determined adjustment mode.
When the adjustment mode includes adjustment of the observation direction, the camera model is rotated or switched to adjust the observation direction; that is, the observation direction is adjusted either by rotating the observation position of a single camera model within the virtual scene or by switching between multiple camera models. For example, when the observation direction is adjusted to the right, the camera model is rotated to the right around the observation center, or the first camera model is switched to a second camera model on the right side for observation.
When the adjustment mode includes adjustment of the observation distance, the camera model is translated along the observation direction to adjust the observation distance; that is, along the observation direction of the camera model, the camera model is moved toward the observation center or away from the observation center, thereby adjusting the observation distance.
When the adjustment mode includes both adjustment of the observation direction and adjustment of the observation distance, the observation position of the camera model along the observation direction is adjusted while the camera model is rotated or switched.
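A hedged sketch combining both adjustment modes in a single update, under the same illustrative assumptions as the sketches above (y-up coordinates, yaw-only rotation, assumed limit values):

    import math

    def apply_view_adjustment(position, center, yaw_degrees=0.0,
                              distance_delta=0.0, min_dist=2.0, max_dist=20.0):
        """Rotate the camera model around the observation center, then move it
        along the observation direction with the distance clamped."""
        dx, dy, dz = (p - c for p, c in zip(position, center))
        # Direction adjustment: yaw rotation in the horizontal plane.
        theta = math.radians(yaw_degrees)
        dx, dz = (dx * math.cos(theta) - dz * math.sin(theta),
                  dx * math.sin(theta) + dz * math.cos(theta))
        # Distance adjustment: clamp the new distance to the permitted range.
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        new_dist = max(min_dist, min(max_dist, dist + distance_delta))
        scale = new_dist / dist
        return (center[0] + dx * scale,
                center[1] + dy * scale,
                center[2] + dz * scale)

    # Direction and distance adjusted in one operation:
    pos = apply_view_adjustment((0.0, 2.0, 5.0), (0.0, 0.0, 0.0),
                                yaw_degrees=15.0, distance_delta=-1.0)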
In summary, according to the interactive observation method based on the virtual scene provided by the embodiments of the present application, the matting objects corresponding to the players are displayed in combination with the virtual scene, and while the matting objects are displayed in the virtual scene, the observation angle for observing the virtual scene is adjusted through the viewing angle adjustment operation. The matting objects thus improve the interactive diversity of the virtual scene, while the added ways of adjusting the observation angle improve the observation efficiency of the virtual scene.
The method provided by this embodiment increases the diversity of observation modes when the observation angle is adjusted, including adjustment of the observation direction and adjustment of the observation distance.
Schematically, the interaction method based on the virtual scene provided by the embodiments of the present application is applied to a cloud game. Fig. 12 is an overall flowchart of an interactive viewing method according to an exemplary embodiment of the present application. As shown in fig. 12, the process includes:
in step 1201, the cloud server starts a cloud game.
The cloud game is run in the cloud server so that the player can access the cloud server to play the cloud game.
Step 1202, a player logs into a lobby.
In some embodiments, multiple players log into a cloud gaming lobby using an account system.
At step 1203, the player joins the room through the lobby.
In the embodiments of the present application, there is a cloud game room that includes at least two players, and the at least two players can interact with each other.
In step 1204, the cloud server initializes player data.
The cloud server first initializes the game account data corresponding to the player.
In step 1205, the cloud server creates a personal rendering camera.
The cloud server creates, in the game scene, a virtual camera unique to each player, bound to that player one-to-one, which is used to capture the game picture from the specified angle and transmit it back to the specified player.
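A minimal sketch of this one-to-one binding, assuming a simple in-memory registry; the RenderCamera fields and the function names are illustrative stand-ins for whatever per-camera state the actual engine keeps:

    from dataclasses import dataclass

    @dataclass
    class RenderCamera:
        # position/target stand in for the per-camera state the engine keeps
        player_id: str
        position: tuple = (0.0, 2.0, 5.0)
        target: tuple = (0.0, 0.0, 0.0)

    # One-to-one binding between players and their personal rendering cameras.
    cameras: dict[str, RenderCamera] = {}

    def create_personal_camera(player_id: str) -> RenderCamera:
        camera = RenderCamera(player_id=player_id)
        cameras[player_id] = camera
        return camera

    def camera_for(player_id: str) -> RenderCamera:
        # Each frame is captured from the player's own camera and sent back
        # only to that player.
        return cameras[player_id]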
In step 1206, the cloud server creates a personal audio group.
The personal audio group is the audio group corresponding to a player and is used to deliver independent audio content to that player. Illustratively, when player A and player B are in the same virtual scene, player A selects audio a as background audio and player B selects audio b as background audio; the data of audio a is then transmitted to player A and the data of audio b is transmitted to player B, each played independently.
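A minimal sketch of per-player audio routing under this scheme; the identifiers player_a, audio_a and the helper names are illustrative assumptions:

    audio_groups: dict[str, str] = {}

    def select_background_audio(player_id: str, track_id: str) -> None:
        # The selection only changes this player's own audio group.
        audio_groups[player_id] = track_id

    def audio_for(player_id: str) -> str | None:
        # The server streams back only the player's own selection.
        return audio_groups.get(player_id)

    select_background_audio("player_a", "audio_a")
    select_background_audio("player_b", "audio_b")
    assert audio_for("player_a") == "audio_a"   # players in the same scene
    assert audio_for("player_b") == "audio_b"   # hear different tracks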
In step 1207, the player establishes a connection with the cloud server and exchanges codec information.
Encoding and decoding are corresponding processes of video processing: the cloud server encodes the game process video and sends the encoded video to the terminal, and the terminal decodes the encoded video to obtain a decoded video stream for playback. Conversely, the terminal acquires a video stream through its camera, encodes that video stream, and sends the encoded video stream to the cloud server for decoding, subsequent matting and other processing.
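A hedged sketch of the two encode/decode paths; encode and decode here are deliberately trivial placeholders, not a real video codec API, and the frame contents are illustrative:

    def encode(frames: list[bytes]) -> bytes:
        # Placeholder: a real deployment would use a video codec here.
        return b"".join(frames)

    def decode(stream: bytes, frame_size: int) -> list[bytes]:
        return [stream[i:i + frame_size] for i in range(0, len(stream), frame_size)]

    # Downlink: the server encodes the rendered game video; the terminal
    # decodes it into a playable video stream.
    game_frames = [b"frame1", b"frame2"]
    played = decode(encode(game_frames), frame_size=6)

    # Uplink: the terminal encodes camera frames; the server decodes them
    # before matting and further processing.
    camera_frames = [b"cam001", b"cam002"]
    received = decode(encode(camera_frames), frame_size=6)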
At step 1208, the cloud server sends the camera-rendered and encoded video stream to the player.
The cloud server renders the video content for the player, which includes obtaining the matting object, the virtual scene data, the display position data and the like, renders the video stream according to the computed display data, and sends the encoded video stream to the terminal.
In step 1209, the player performs data stream simulation input.
In some embodiments, the player first receives a portion of test data for simulated input; when the test data is input successfully, it is determined that the player can successfully perform control operations of the cloud game, and the player is then able to further control the camera or other controls.
In step 1210, the cloud server controls player cameras/operations.
That is, the cloud server acquires the observation angle control operation performed by the player and adjusts the position of the camera model according to that operation, thereby adjusting the observation angle.
Fig. 13 is a block diagram of an interactive observation device based on a virtual scene according to an exemplary embodiment of the present application. As shown in fig. 13, the device is applied to a first terminal configured with a camera, and the device includes:
an acquisition module 1310, configured to acquire, in response to a virtual scene display operation, a first scene image through the camera, where the first scene image includes a first object located in a camera shooting range of the first terminal;
a display module 1320, configured to display a virtual environment picture, where the virtual environment picture includes a virtual scene and a matting object that are observed at a first observation angle, and the matting object includes the first object obtained by matting the first scene image and a second object obtained by matting a second scene image, where the second scene image is an image acquired by a second terminal configured with a camera;
A receiving module 1330 for receiving a viewing angle adjustment operation;
the display module 1320 is further configured to adjust a first observation angle for observing the virtual scene and the matting object to a second observation angle based on the viewing angle adjustment operation.
In an optional embodiment, the display module 1320 is further configured to adjust the first observation angle at which the virtual scene is observed to the second observation angle based on the viewing angle adjustment operation, and to keep the position of the matting object in the virtual scene unchanged so that the matting object is displayed facing the second observation angle.
In an optional embodiment, the terminal corresponds to a camera model in the virtual scene, and the camera model is used for adjusting the observation angle;
as shown in fig. 14, the apparatus further includes:
an adjustment module 1340, configured to adjust, based on the viewing angle adjustment operation, the camera model from a first observation position at which the virtual scene is observed at the first observation angle to a second observation position at which the virtual scene is observed at the second observation angle.
In an alternative embodiment, the virtual scene corresponds to camera models, wherein the first observation angle corresponds to a first camera model and the second observation angle corresponds to a second camera model;
the device further comprises:
an adjustment module 1340 for switching from observing the virtual scene through the first camera model to observing the virtual scene through the second camera model based on the viewing angle adjustment operation.
In an alternative embodiment, the virtual scene is correspondingly provided with a camera model;
the display module 1320 is further configured to keep the position of the matting object in the virtual scene unchanged, and to control the matting object to be displayed facing the camera model corresponding to the second observation angle.
In an optional embodiment, an angle adjustment control is displayed on the virtual environment picture, and the viewing angle adjustment operation is an operation received on the angle adjustment control;
the apparatus further comprises:
the determining module is configured to determine the adjustment mode corresponding to the viewing angle adjustment operation on the angle adjustment control;
the display module 1320 is further configured to adjust the first observation angle at which the virtual scene is observed to the second observation angle in the adjustment mode.
In an alternative embodiment, the angle adjustment control includes a direction adjustment area;
the receiving module 1330 is further configured to receive a direction adjustment operation on the direction adjustment area;
the display module 1320 is further configured to adjust the first observation angle for observing the virtual scene to the second observation angle based on the adjustment direction corresponding to the direction adjustment operation.
In an alternative embodiment, the angle adjustment control includes a distance adjustment region;
the receiving module 1330 is further configured to receive a distance adjustment operation on the distance adjustment area, where the distance adjustment operation is used to adjust a position of the observation point in the observation direction;
the device further comprises:
an adjustment module 1340 for adjusting an observation distance for observing the virtual scene based on the distance adjustment operation.
In summary, according to the interactive observation device based on the virtual scene provided by the embodiments of the present application, the matting objects corresponding to the players are displayed in combination with the virtual scene, and while the matting objects are displayed in the virtual scene, the observation angle for observing the virtual scene is adjusted through the viewing angle adjustment operation. The matting objects thus improve the interactive diversity of the virtual scene, while the added ways of adjusting the observation angle improve the observation efficiency of the virtual scene.
It should be noted that the interactive observation device based on the virtual scene provided in the above embodiment is illustrated only by the division of the above functional modules; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the interactive observation device based on the virtual scene provided in the above embodiment belongs to the same concept as the embodiments of the interactive observation method based on the virtual scene; its detailed implementation process is described in the method embodiments and is not repeated here.
Fig. 15 shows a block diagram of a terminal 1500 according to an exemplary embodiment of the present application. The terminal 1500 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1500 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, and the like.
In general, the terminal 1500 includes: a processor 1501 and a memory 1502.
The processor 1501 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1501 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array) and PLA (Programmable Logic Array). The processor 1501 may also include a main processor and a coprocessor; the main processor, also called a CPU (Central Processing Unit), is a processor for processing data in the awake state, and the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1501 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1502 may include one or more computer-readable storage media, which may be non-transitory. Memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1502 is configured to store at least one instruction for execution by processor 1501 to implement the virtual scene based interactive viewing method provided by the method embodiments of the present application.
In some embodiments, the terminal 1500 may further optionally include: a peripheral interface 1503 and at least one peripheral device. The processor 1501, memory 1502 and peripheral interface 1503 may be connected by a bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1503 via a bus, signal lines, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, a display screen 1505, a camera assembly 1506, audio circuitry 1507, a positioning assembly 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, the memory 1502 and the peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502 and the peripheral interface 1503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1504 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1504 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1504 may communicate with other terminals via at least one wireless communication protocol, including, but not limited to: the World Wide Web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G and 5G), wireless local area networks and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which is not limited in the present application.
The display screen 1505 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, it also has the ability to collect touch signals on or above its surface. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, provided on the front panel of the terminal 1500; in other embodiments, there may be at least two display screens 1505, respectively disposed on different surfaces of the terminal 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display disposed on a curved or folded surface of the terminal 1500. The display screen 1505 may even be arranged in a non-rectangular irregular pattern, that is, an irregularly shaped screen. The display screen 1505 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 1506 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuitry 1507 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 1501 for processing, or inputting the electric signals to the radio frequency circuit 1504 for voice communication. For purposes of stereo acquisition or noise reduction, a plurality of microphones may be respectively disposed at different portions of the terminal 1500. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The positioning component 1508 is for positioning a current geographic location of the terminal 1500 to enable navigation or LBS (Location Based Service, location-based services).
The power supply 1509 is used to power the various components in the terminal 1500. The power supply 1509 may use alternating current, direct current, a disposable battery or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil. The rechargeable battery may also support fast-charge technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyroscope sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1501 may control the touch display screen 1505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for the acquisition of motion data of a game or user.
The gyro sensor 1512 may detect the body direction and rotation angle of the terminal 1500, and may cooperate with the acceleration sensor 1511 to collect the user's 3D motion on the terminal 1500. Based on the data collected by the gyro sensor 1512, the processor 1501 may implement the following functions: motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
Pressure sensor 1513 may be disposed on a side frame of terminal 1500 and/or below touch display 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal 1500, a grip signal of the user on the terminal 1500 may be detected, and the processor 1501 performs left-right hand recognition or quick operation according to the grip signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at the lower layer of the touch display screen 1505, the processor 1501 realizes control of the operability control on the UI interface according to the pressure operation of the user on the touch display screen 1505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1514 is used to collect the user's fingerprint, and the processor 1501 identifies the user according to the fingerprint collected by the fingerprint sensor 1514, or the fingerprint sensor 1514 itself identifies the user according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 1501 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1514 may be provided on the front, back or side of the terminal 1500. When a physical key or a vendor Logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with the physical key or the vendor Logo.
The optical sensor 1515 is used to collect the ambient light intensity. In one embodiment, processor 1501 may control the display brightness of touch display screen 1505 based on the intensity of ambient light collected by optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1505 is turned up; when the ambient light intensity is low, the display brightness of the touch display screen 1505 is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
A proximity sensor 1516, also referred to as a distance sensor, is typically provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects a gradual decrease in the distance between the user and the front of the terminal 1500, the processor 1501 controls the touch display 1505 to switch from the on-screen state to the off-screen state; when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually increases, the touch display screen 1505 is controlled by the processor 1501 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 15 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
Alternatively, the computer-readable storage medium may include: Read-Only Memory (ROM), Random Access Memory (RAM), Solid State Drives (SSD), an optical disk, or the like. The random access memory may include Resistance Random Access Memory (ReRAM) and Dynamic Random Access Memory (DRAM). The foregoing embodiment numbers of the present application are for description only and do not represent the superiority or inferiority of the embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
The foregoing description concerns preferred embodiments of the present application and is not intended to limit the application; the scope of protection of the application is defined by the appended claims.

Claims (16)

1. An interactive observation method based on a virtual scene, which is characterized by comprising the following steps:
acquiring a first scene image in response to a virtual scene display operation, wherein the first scene image comprises a first object;
displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object which are observed at a first observation angle, and the matting object comprises a first object obtained by matting the first scene image;
receiving a viewing angle adjustment operation;
based on the viewing angle adjustment operation, adjusting a first observation angle at which the virtual scene and the matting object are observed to a second observation angle, wherein the first observation angle and the second observation angle are angles at which the virtual scene is observed from different observation positions or different observation distances.
2. The method of claim 1, wherein the adjusting the first observation angle at which the virtual scene and the matting object are observed to the second observation angle based on the viewing angle adjustment operation comprises:
based on the viewing angle adjustment operation, adjusting the first observation angle at which the virtual scene is observed to the second observation angle;
and keeping the position of the matting object in the virtual scene unchanged, so that the matting object is displayed facing the second observation angle.
3. The method of claim 2, wherein the virtual scene corresponds to a camera model, the camera model being used to adjust an angle of view of the virtual scene;
the adjusting the first observation angle to the second observation angle based on the viewing angle adjustment operation comprises:
based on the viewing angle adjustment operation, adjusting the camera model from a first observation position at which the virtual scene is observed at the first observation angle to a second observation position at which the virtual scene is observed at the second observation angle.
4. The method of claim 2, wherein the virtual scene corresponds to camera models, wherein the first observation angle corresponds to a first camera model and the second observation angle corresponds to a second camera model;
the adjusting the first observation angle to the second observation angle based on the viewing angle adjustment operation comprises:
based on the viewing angle adjustment operation, switching from observing the virtual scene through the first camera model to observing the virtual scene through the second camera model.
5. The method of claim 2, wherein the virtual scene corresponds to a camera model;
the step of keeping the position of the matting object in the virtual scene unchanged, so as to display the matting object in a direction facing the second observation angle, includes:
maintaining the position of the matting object in the virtual scene unchanged;
and controlling the matting object to be displayed facing the camera model corresponding to the second observation angle.
6. The method according to any one of claims 1 to 5, wherein the method is applied to a first terminal configured with a camera;
the obtaining a first scene image in response to a virtual scene display operation includes:
and responding to the virtual scene display operation, acquiring the first scene image by the camera, wherein the first scene image is an image acquired within the shooting range of the camera.
7. The method according to any one of claims 1 to 5, wherein an angle adjustment control is displayed on the virtual environment picture, and the viewing angle adjustment operation is an operation received on the angle adjustment control;
The adjusting the first observation angle for observing the virtual scene and the matting object to a second observation angle based on the viewing angle adjustment operation includes:
determining the adjustment mode corresponding to the viewing angle adjustment operation on the angle adjustment control;
and adjusting the first observation angle for observing the virtual scene to the second observation angle in the adjustment mode.
8. The method of claim 7, wherein the angle adjustment control comprises a direction adjustment area;
the receiving a viewing angle adjustment operation comprises:
receiving a direction adjustment operation on the direction adjustment area;
the adjusting the first observation angle for observing the virtual scene to the second observation angle in the adjustment mode comprises:
and adjusting the first observation angle for observing the virtual scene to the second observation angle based on the adjustment direction corresponding to the direction adjustment operation.
9. The method of claim 7, wherein the angle adjustment control comprises a distance adjustment area;
the method further comprises the steps of:
receiving a distance adjustment operation on the distance adjustment area, the distance adjustment operation being for adjusting a position of the observation point in the observation direction;
And adjusting the observation distance for observing the virtual scene based on the distance adjustment operation.
10. The method according to any one of claims 1 to 5, wherein displaying the virtual environment picture includes:
transmitting the first scene image to a server;
receiving picture display data fed back by a server, wherein the picture display data comprises scene data corresponding to the virtual scene and object data corresponding to the matting object;
the virtual environment screen is displayed based on the scene data and the object data.
11. The method according to any one of claims 1 to 5, further comprising, before displaying the virtual environment screen:
displaying a calibration picture, wherein the calibration picture comprises the first scene image, the first scene image comprises an indication frame and an indication line, the indication frame is used for carrying out frame selection indication on the first object, and the indication line is positioned at a designated position of the first scene image and divides the first scene image into a first area and a second area;
and indicating the background portion of the first scene image by acquiring images of the process in which the first object moves from a first position to a second position, wherein the first position is the position of the indication frame in the first area, and the second position is the position of the indication frame in the second area.
12. The method according to any one of claims 1 to 5, further comprising, after the displaying the virtual environment picture:
in response to the first object and the second object in the virtual scene meeting an interaction requirement, displaying an interaction animation through the virtual environment picture.
13. The method according to any one of claims 1 to 5, wherein,
the matting object further comprises a second object, the second object is obtained by matting a second scene image, and the second scene image is acquired through a second terminal.
14. An interactive viewing device based on a virtual scene, the device comprising:
the acquisition module is used for responding to the virtual scene display operation to acquire a first scene image, wherein the first scene image comprises a first object;
the display module is used for displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object which are observed at a first observation angle, and the matting object comprises a first object obtained by matting the first scene image;
the receiving module is used for receiving the visual angle adjustment operation;
The display module is further configured to adjust a first observation angle at which the virtual scene and the matting object are observed to a second observation angle based on the viewing angle adjustment operation, where the first observation angle and the second observation angle are angles at which the virtual scene is observed from different observation positions or different observation distances.
15. A computer device comprising a processor and a memory having stored therein at least one instruction, at least one program, code set, or instruction set that is loaded and executed by the processor to implement the virtual scene-based interactive viewing method of any of claims 1-13.
16. A computer readable storage medium having stored therein at least one instruction, at least one program, code set, or instruction set, the at least one instruction, the at least one program, the code set, or instruction set being loaded and executed by a processor to implement the virtual scene-based interactive viewing method of any of claims 1-13.
CN202311135567.5A 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene Pending CN117085322A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311135567.5A CN117085322A (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202311135567.5A CN117085322A (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene
CN202110702790.8A CN113274729B (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202110702790.8A Division CN113274729B (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Publications (1)

Publication Number Publication Date
CN117085322A true CN117085322A (en) 2023-11-21

Family

ID=77285349

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110702790.8A Active CN113274729B (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene
CN202311135567.5A Pending CN117085322A (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110702790.8A Active CN113274729B (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Country Status (1)

Country Link
CN (2) CN113274729B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113769397B (en) * 2021-09-28 2024-03-22 腾讯科技(深圳)有限公司 Virtual object setting method, device, equipment, medium and program product
CN114745598B (en) * 2022-04-12 2024-03-19 北京字跳网络技术有限公司 Video data display method and device, electronic equipment and storage medium
CN115048017B (en) * 2022-07-28 2023-10-17 广东伟达智能装备股份有限公司 Control method for synchronizing simulated grabbing and placing box and live-action in 3D control system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5870771A (en) * 1996-11-15 1999-02-09 Oberg; Larry B. Computerized system for selecting, adjusting, and previewing framing product combinations for artwork and other items to be framed
US20140063061A1 (en) * 2011-08-26 2014-03-06 Reincloud Corporation Determining a position of an item in a virtual augmented space
CN106293557B (en) * 2015-05-04 2019-12-03 北京智谷睿拓技术服务有限公司 Display control method and device
CN205987196U (en) * 2016-08-26 2017-02-22 万象三维视觉科技(北京)有限公司 Bore hole 3D virtual reality display system
CN106385576B (en) * 2016-09-07 2017-12-08 深圳超多维科技有限公司 Stereoscopic Virtual Reality live broadcasting method, device and electronic equipment
CN107230182B (en) * 2017-08-03 2021-11-09 腾讯科技(深圳)有限公司 Image processing method and device and storage medium
CN108762508A (en) * 2018-05-31 2018-11-06 北京小马当红文化传媒有限公司 A kind of human body and virtual thermal system system and method for experiencing cabin based on VR
CN110465073A (en) * 2019-08-08 2019-11-19 腾讯科技(深圳)有限公司 Method, apparatus, equipment and the readable storage medium storing program for executing that visual angle adjusts in virtual environment
CN110969905A (en) * 2019-11-29 2020-04-07 塔普翊海(上海)智能科技有限公司 Remote teaching interaction and teaching aid interaction system for mixed reality and interaction method thereof
CN111988534B (en) * 2020-07-23 2021-08-20 首都医科大学附属北京朝阳医院 Multi-camera-based picture splicing method and device

Also Published As

Publication number Publication date
CN113274729A (en) 2021-08-20
CN113274729B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
CN108619721B (en) Distance information display method and device in virtual scene and computer equipment
CN113274729B (en) Interactive observation method, device, equipment and medium based on virtual scene
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
CN111050189B (en) Live broadcast method, device, equipment and storage medium
CN113633973B (en) Game picture display method, device, equipment and storage medium
CN112929687B (en) Live video-based interaction method, device, equipment and storage medium
CN109803154B (en) Live broadcast method, equipment and storage medium for chess game
CN111921197B (en) Method, device, terminal and storage medium for displaying game playback picture
CN111318026B (en) Team forming method, device, equipment and storage medium for competitive game
CN112118477B (en) Virtual gift display method, device, equipment and storage medium
CN110496392B (en) Virtual object control method, device, terminal and storage medium
JP2021535806A (en) Virtual environment observation methods, devices and storage media
CN109771955B (en) Invitation request processing method, device, terminal and storage medium
CN108579075B (en) Operation request response method, device, storage medium and system
WO2021164315A1 (en) Hotspot map display method and apparatus, and computer device and readable storage medium
CN113244616B (en) Interaction method, device and equipment based on virtual scene and readable storage medium
CN112827166A (en) Card object-based interaction method and device, computer equipment and storage medium
CN110833695B (en) Service processing method, device, equipment and storage medium based on virtual scene
CN113490010A (en) Interaction method, device and equipment based on live video and storage medium
CN109806583B (en) User interface display method, device, equipment and system
WO2022237076A1 (en) Method and apparatus for controlling avatar, and device and computer-readable storage medium
CN112774185A (en) Virtual card control method, device and equipment in card virtual scene
CN112973116B (en) Virtual scene picture display method and device, computer equipment and storage medium
CN113599810B (en) Virtual object-based display control method, device, equipment and medium
CN113633970B (en) Method, device, equipment and medium for displaying action effect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination