CN113274729A - Interactive observation method, device, equipment and medium based on virtual scene


Info

Publication number
CN113274729A
Authority
CN
China
Prior art keywords
observation
virtual scene
angle
scene
virtual
Prior art date
Legal status
Granted
Application number
CN202110702790.8A
Other languages
Chinese (zh)
Other versions
CN113274729B (en)
Inventor
张冲 (Zhang Chong)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202311135567.5A
Priority to CN202110702790.8A
Publication of CN113274729A
Application granted
Publication of CN113274729B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525: Changing parameters of virtual cameras
    • A63F 13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/837: Shooting of targets
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80: Features of games using an electronically generated display having two or more dimensions specially adapted for executing a specific type of game
    • A63F 2300/8076: Shooting

Abstract

The application discloses an interactive observation method, device, equipment and medium based on a virtual scene, and relates to the field of virtual environments. The method comprises the following steps: acquiring a first scene image through a camera, wherein the first scene image comprises a first object; displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object observed at a first observation angle, and the matting object comprises a first object obtained by matting the first scene image and a second object obtained by matting a second scene image; receiving a viewing angle adjustment operation; and adjusting the first observation angle to a second observation angle based on the viewing angle adjustment operation. When the matting object is displayed in the virtual scene, the observation angle for observing the virtual scene is adjusted through the viewing angle adjustment operation, so that while the matting object improves the interactive diversity of the virtual scene, the ways of adjusting the angle for observing the virtual scene are increased and the efficiency of observing the virtual scene is improved.

Description

Interactive observation method, device, equipment and medium based on virtual scene
Technical Field
The embodiment of the application relates to the field of virtual environments, in particular to an interactive observation method, device, equipment and medium based on a virtual scene.
Background
An application program based on a virtual scene is generally a program that builds a virtual environment from three-dimensional models and then runs on that virtual environment; while the application program runs, the player interacts with the virtual environment by controlling a virtual object to move within it.
In the related art, when controlling a virtual object in the virtual environment, the player may control the virtual object by touching the display screen, or by inputting control signals through an external input device, and the virtual object moves in the virtual environment under the player's control.
However, in the interactive observation process implemented in this way, the virtual object is usually observed by switching between a first-person viewing angle and a third-person viewing angle; the viewing direction during observation follows the sight direction of the virtual object, the observation result is presented from the virtual object's viewing angle, and the observation mode is relatively limited.
Disclosure of Invention
The embodiment of the application provides an interactive observation method, device, equipment and medium based on a virtual scene, which can improve the efficiency of observing a virtual environment. The technical scheme is as follows:
in one aspect, an interactive observation method based on a virtual scene is provided, and is applied to a first terminal configured with a camera, and the method includes:
acquiring a first scene image through the camera in response to a virtual scene display operation, wherein the first scene image comprises a first object located in a camera shooting range of the first terminal;
displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a cutout object observed at a first observation angle, the cutout object comprises a first object obtained by cutout of the first scene image and a second object obtained by cutout of the second scene image, and the second scene image is an image acquired by a second terminal provided with a camera;
receiving a visual angle adjusting operation;
based on the visual angle adjusting operation, adjusting a first observation angle for observing the virtual scene and the sectional object to be a second observation angle, wherein the first observation angle and the second observation angle are angles for observing the virtual scene from different observation positions or different observation distances.
In another aspect, an interactive viewing device based on a virtual scene is provided, and is applied to a first terminal configured with a camera, the device includes:
the acquisition module is used for responding to virtual scene display operation and acquiring a first scene image through the camera, wherein the first scene image comprises a first object positioned in the shooting range of the camera of the first terminal;
the display module is used for displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object observed at a first observation angle, the matting object comprises a first object obtained by matting the first scene image and a second object obtained by matting a second scene image, and the second scene image is an image acquired by a second terminal provided with a camera;
the receiving module is used for receiving the visual angle adjusting operation;
the display module is further configured to adjust the first observation angle for observing the virtual scene and the matting object to a second observation angle based on the viewing angle adjustment operation, where the first observation angle and the second observation angle are angles for observing the virtual scene from different observation positions or different observation distances.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the virtual scene based interactive viewing method according to any of the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by a processor to implement the virtual scene based interactive viewing method according to any of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to make the computer device execute the interactive observation method based on the virtual scene in any one of the above embodiments.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
First, the matting object corresponding to the player is displayed in combination with the virtual scene. When the matting object is displayed in the virtual scene, the observation angle for observing the virtual scene is adjusted through the viewing angle adjustment operation, which may adjust either the observation position or the observation distance. Thus, while the matting object improves the interactive diversity of the virtual scene, the ways of adjusting the angle for observing the virtual scene are increased and the efficiency of observing the virtual scene is improved.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a matting object generation process provided by an exemplary embodiment of the present application;
FIG. 2 is a block diagram of an electronic device provided in an exemplary embodiment of the present application;
FIG. 3 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 4 is an interface diagram of an interactive viewing method based on a virtual scene according to an exemplary embodiment of the present application;
FIG. 5 is a schematic view of a viewing angle adjustment provided based on the embodiment shown in FIG. 4;
FIG. 6 is a general schematic diagram of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 7 is a general schematic diagram of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 8 is a flowchart of an interactive viewing method based on a virtual scene according to another exemplary embodiment of the present application;
FIG. 9 is a schematic diagram of a viewing angle change process provided based on the embodiment shown in FIG. 8;
FIG. 10 is a flowchart of an interactive viewing method based on a virtual scene according to another exemplary embodiment of the present application;
FIG. 11 is a schematic view of the directional adjustment provided based on the embodiment shown in FIG. 10;
FIG. 12 is a flowchart illustrating an interactive viewing method according to an exemplary embodiment of the present application;
FIG. 13 is a block diagram illustrating an interactive viewing device based on a virtual scene according to an exemplary embodiment of the present application;
FIG. 14 is a block diagram illustrating a structure of an interactive viewing device based on a virtual scene according to another exemplary embodiment of the present application;
FIG. 15 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are briefly described:
Virtual environment: the virtual environment that is displayed (or provided) when an application program runs on the terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with a three-dimensional virtual environment. In the embodiments of the present application, the virtual environment is also referred to as a virtual scene.
Matting object: a designated object cut out from a scene image after the scene image is acquired by a live-action camera. In this embodiment, a matting object obtained by matting a person image out of a scene image is taken as an example. Referring to fig. 1, which illustrates the process of generating a matting object according to an exemplary embodiment of the present application: a scene image 110 is obtained by capturing a scene with a live-action camera 100, the image acquisition range of the live-action camera 100 includes a person 120, so the scene image 110 includes a corresponding object 121, and the object 121 is matted out of the scene image 110 to obtain a matting object 122. A minimal extraction sketch follows.
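Illustratively, once a segmentation mask for the person is available, cutting the matting object out of the scene image reduces to masking and cropping. The following is a minimal sketch, assuming a boolean person mask from an unspecified segmentation step; extract_matting_object and its parameter names are illustrative, not part of the disclosed method.

```python
import numpy as np

def extract_matting_object(scene_image: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Cut a matting object out of an H x W x 3 scene image.

    `mask` is an H x W boolean array marking the person's pixels
    (e.g. produced by a segmentation model); everything outside the
    mask becomes transparent in the returned RGBA image.
    """
    h, w, _ = scene_image.shape
    rgba = np.zeros((h, w, 4), dtype=np.uint8)
    rgba[..., :3] = scene_image
    rgba[..., 3] = np.where(mask, 255, 0)  # opaque person, transparent background
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return rgba  # no person detected: return the fully transparent frame
    # Crop to the person's bounding box so the matting object stays compact.
    return rgba[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
```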
In the embodiment of the application, the virtual scene and the cutout object positioned in the virtual scene are displayed in the virtual environment picture, so that the interactive experience of a player in the virtual scene is created.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. The terminal installs and runs an application program supporting a virtual environment, such as an application program supporting a three-dimensional virtual environment. The application program may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a Third-Person Shooter game (TPS), a First-Person Shooter game (FPS), and a Multiplayer Online Battle Arena game (MOBA). The application program may be a stand-alone application program, such as a stand-alone three-dimensional game program, or a network online application program.
Fig. 2 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 200 includes: an operating system 220 and application programs 222.
Operating system 220 is the base software that provides applications 222 with secure access to computer hardware.
Application 222 is an application that supports a virtual environment. Optionally, application 222 is an application that supports a three-dimensional virtual environment. The application 222 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The application 222 may be a stand-alone application, such as a stand-alone three-dimensional game program, or an online application.
Fig. 3 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 300 includes: a first device 320, a server 340, and a second device 360.
The first device 320 installs and runs an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game and a multiplayer gunfight survival game. The first device 320 is a device used by a first user, who uses it to control the activity of a first matting object located in the virtual environment. The first device 320 is configured with a first camera; after the first camera captures an image of the first user (or another user within its image acquisition range) and the image is matted, the first matting object is displayed in the virtual environment.
The first device 320 is connected to the server 340 through a wireless network or a wired network.
The server 340 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 340 is used for providing background services for applications supporting a three-dimensional virtual environment. Optionally, the server 340 undertakes the primary computing work while the first device 320 and the second device 360 undertake the secondary computing work; or the server 340 undertakes the secondary computing work while the first device 320 and the second device 360 undertake the primary computing work; or the server 340, the first device 320, and the second device 360 cooperate using a distributed computing architecture.
The second device 360 installs and runs an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game and a multiplayer gunfight survival game. The second device 360 is a device used by a second user, who uses it to control the activity of a second matting object in the virtual environment. The second device 360 is configured with a second camera; after the second camera captures an image of the second user (or another user within its image acquisition range) and the image is matted, the second matting object is displayed in the virtual environment.
Optionally, the first matting object and the second matting object are in the same virtual environment. Optionally, the first matting object and the second matting object may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights. Alternatively, the first matting object and the second matting object may belong to different teams, different organizations, or two groups that are hostile to each other.
Alternatively, the applications installed on the first device 320 and the second device 360 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 320 may generally refer to one of a plurality of devices, and the second device 360 may generally refer to one of a plurality of devices, and this embodiment is illustrated by the first device 320 and the second device 360 only. The device types of the first device 320 and the second device 360 are the same or different, and include: at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated where the device is a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or fewer. For example, the number of the devices may be only one, or several tens or hundreds, or more. The number and the type of the devices are not limited in the embodiments of the present application.
It should be noted that the server 340 may be implemented as a physical server, or as a cloud server. Cloud technology refers to a hosting technology that unifies serial resources such as hardware, software, and network within a wide area network or a local area network to realize computing, storage, processing, and sharing of data. Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology, and the like applied in the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology is becoming an important support: background services of technical network systems, such as video websites, picture websites, and other web portals, require large amounts of computing and storage resources. With the development of the internet industry, each article may come to have its own identification mark that needs to be transmitted to a background system for logic processing; data at different levels are processed separately, and all kinds of industrial data need strong system background support, which can only be realized through cloud computing.
In some embodiments, the method provided by the embodiment of the application can be applied to a cloud game scene, so that the data logic calculation in the game process is completed through the cloud server, and the terminal is responsible for displaying the game interface.
In some embodiments, the server 340 may also be implemented as a node in a blockchain system. The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. The blockchain is essentially a decentralized database: a chain of data blocks associated by cryptographic methods, where each data block contains information about a batch of network transactions, used for verifying the validity (anti-counterfeiting) of the information and generating the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, and an application services layer.
The application scenario of the embodiment of the application includes at least one of the following scenarios:
First, the method is applied to a game scene, where the game can be implemented as a cloud game, that is, a cloud server completes the computing logic during the game while a terminal completes the display logic.
Illustratively, the game may be implemented as at least one of a dance game, a shooting game, and a puzzle game. Player A captures a scene image through a first terminal configured with a first camera, and a matting object a corresponding to player A is obtained from that scene image by matting; player B captures a scene image through a second terminal configured with a second camera, and a matting object b corresponding to player B is obtained from that scene image by matting. The matting object a, the matting object b, and a preset virtual scene are displayed in the terminal interface, so that player A and player B interact and participate in the game within the virtual scene.
Second, the method is applied to a live broadcast scene. A live broadcast application program involves an anchor and an audience: the anchor is the user who creates a live broadcast room, and the audience are the users who watch it. In the live broadcast room, audience members can interact with each other in the virtual scene, or the anchor can interact with the audience in the virtual scene.
Illustratively, anchor 1 creates a virtual scene interaction activity in the live broadcast room and invites the audience to participate. Viewer 2 is invited to join the activity together with anchor 1: anchor 1 captures a scene image through a first terminal configured with a first camera, and a matting object m corresponding to anchor 1 is obtained from the scene image by matting; viewer 2 captures a scene image through a second terminal configured with a second camera, and a matting object n corresponding to viewer 2 is obtained from the scene image by matting. The matting object m, the matting object n, and a preset virtual scene are displayed in the terminal interface, realizing interaction between anchor 1 and viewer 2 in the virtual scene. Viewers other than viewer 2 can watch the interaction between anchor 1 and viewer 2 in the virtual scene.
With reference to the above term introduction and description of the implementation environment, the interactive observation method based on a virtual scene provided in an embodiment of the present application is described below. Referring to fig. 4, which shows a flowchart of the method according to an exemplary embodiment of the present application, the method is described taking application in a first terminal configured with a camera as an example. As shown in fig. 4, the method includes:
Step 401, a first scene image is acquired by a camera in response to a virtual scene display operation.
In some embodiments, the virtual scene display operation refers to an operation in which a user instructs to open a virtual scene.
Illustratively, for different application scenes, the implementation manner of the virtual scene display operation includes at least one of the following manners:
First, for a cloud game application scene, the user triggers a game starting operation as the virtual scene display operation, that is, a cloud game match is started according to the game starting operation and the game interface is entered. The user can enter the cloud game match together with a group of friends, or invite friends to the group after entering the match.
Second, for a live application scene, the anchor account triggers an interactive space opening operation as the virtual scene display operation, that is, an interactive space is opened according to that operation, in which the anchor account interacts with audience accounts in the virtual scene.
The first scene image comprises a first object located in a shooting range of a camera of the first terminal. That is, the first scene image refers to an image captured by a camera configured in the first terminal, where the first terminal continuously captures the first scene image in the form of a video stream, and during the capturing process of the first scene image, the first object is within the shooting range of the camera and is displayed in the first scene image.
In some embodiments, when the terminal receives the virtual scene display operation, the camera is turned on and the first scene image is collected by the camera.
After the first scene image is collected, it needs to undergo matting processing, and the first object is obtained from it by matting. The matting of the first scene image can be completed by the terminal or by the server. When the terminal completes the matting, the first scene image is matted directly after being acquired by the camera, which saves data interaction between the terminal and the server; when the server completes the matting, the first scene image is sent to the server after being acquired by the camera, and the server performs matting on it to obtain the first object.
In some embodiments, the camera configured for the first terminal is a 2-dimensional camera, that is, it acquires planar images. Alternatively, the camera configured for the first terminal is a 3-dimensional camera, that is, depth-of-field information is collected in the process of acquiring images.
When the camera configured for the first terminal is a 2-dimensional camera, data interaction between the terminal and the server is performed on the basis of planar images, which reduces the amount of data exchanged; when it is a 3-dimensional camera, the terminal can construct a three-dimensional model corresponding to the player from the collected depth-of-field information, which improves the realism of the matting object displayed in the virtual scene.
Step 402, displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a matting object observed at a first observation angle.
The cutout object comprises a first object obtained by cutout of a first scene image and a second object obtained by cutout of a second scene image, and the second scene image is an image acquired by a second terminal provided with a camera.
The first object and the second object are objects participating in the same virtual game together, and it should be noted that in this embodiment, the virtual scene includes the first object and the second object as an example for description, in some optional embodiments, the virtual scene may also include only one object, or three or more objects, which is not limited in this embodiment.
In this embodiment, the case where the server performs matting on the scene image is taken as an example: the terminal sends the first scene image to the server and receives the picture display data fed back by the server, where the picture display data includes the scene data corresponding to the virtual scene and the object data corresponding to the matting object.
The scene data of the virtual scene is data corresponding to a preset virtual scene; or the scene data corresponds to a selection result determined by the user's choice of virtual scene, which provides a scheme for the user to select the virtual scene and increases the diversity of virtual scene interaction; or the scene data corresponds to a randomly obtained virtual scene. This embodiment does not limit this.
And the terminal displays the virtual environment picture based on the scene data and the object data fed back by the server.
In some embodiments, the display positions of the first object and the second object in the virtual scene are randomly determined among preset candidate positions, thereby increasing the uncertainty of object display; or the display positions of the first object and the second object in the virtual scene are indicated to the terminal by the server according to a preset display rule. Illustratively, if an object display position is included in the object data, the virtual scene is displayed based on the scene data, and the matting object is displayed at that object display position in the virtual scene, as the sketch below illustrates.
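The random variant can be as simple as drawing from the preset candidates that are still free. A minimal sketch, assuming hypothetical candidate coordinates; CANDIDATE_POSITIONS and pick_display_position are illustrative names, not part of the disclosure.

```python
import random

# Assumed preset candidate positions in the virtual scene (illustrative values).
CANDIDATE_POSITIONS = [(120.0, 300.0), (480.0, 300.0), (300.0, 180.0)]

def pick_display_position(taken: set) -> tuple:
    """Randomly pick a free candidate position for a matting object."""
    free = [p for p in CANDIDATE_POSITIONS if p not in taken]
    position = random.choice(free)
    taken.add(position)  # mark it so the next object lands elsewhere
    return position

# Usage: assign positions to the first and second objects.
occupied: set = set()
first_pos = pick_display_position(occupied)
second_pos = pick_display_position(occupied)
```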
The current virtual environment picture is a picture obtained by observing a virtual scene at a first observation angle, wherein the first observation angle refers to an observation angle of a camera model for observing the virtual scene.
Optionally, each player corresponds to one camera model for observing the virtual scene; or each player corresponds to n camera models for observing the virtual scene, switched among during use; or players who meet the observation switching condition correspond to n camera models switched among during use, while players who do not meet the observation switching condition correspond to a single camera model, where n is an integer greater than 1.
In some embodiments, registration is also needed before the virtual environment picture is displayed, so that the object can be distinguished from the image background and the matting accuracy improved. Optionally, a calibration picture is displayed; the calibration picture includes the first scene image, and the first scene image includes an indication frame and an indication line. The indication frame is used for frame-selecting the first object, and the indication line is located at a specified position of the first scene image, dividing it into a first area and a second area. The background portion of the first scene image is determined by capturing phased images of the first object moving from a first position to a second position, where the first position is the position of the indication frame within the first area and the second position is the position of the indication frame within the second area. That is, the indication frame is moved from the first area of the first scene image to the second area: when the indication frame is displayed in the first area, the content displayed in the second area is pure background; when the indication frame is displayed in the second area, the content displayed in the first area is pure background. Combining the two yields the complete background image of the first scene image excluding the first object, so that when the first object is subsequently matted, the matting can be performed against the identified background, improving the accuracy of the matting result. A sketch of this background assembly follows.
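A minimal sketch of stitching the two phased captures into one clean background, assuming the indication line is a vertical split at column split_x; assemble_background and the frame parameter names are illustrative.

```python
import numpy as np

def assemble_background(frame_subject_left: np.ndarray,
                        frame_subject_right: np.ndarray,
                        split_x: int) -> np.ndarray:
    """Rebuild the full background from two calibration captures.

    `split_x` is the column of the indication line. While the subject
    stands in the left region, the right half of the frame is clean
    background, and vice versa; stitching the two clean halves yields
    a background image with no person in it.
    """
    background = np.empty_like(frame_subject_left)
    background[:, split_x:] = frame_subject_left[:, split_x:]   # subject on the left: right half is clean
    background[:, :split_x] = frame_subject_right[:, :split_x]  # subject on the right: left half is clean
    return background
```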
In some embodiments, the virtual environment picture is displayed in response to the registration of the first object and the second object being completed. After the first object is registered, the server performs matting on the first object in the first scene image according to the first background region; similarly, after the second object is registered, the server performs matting on the second object in the second scene image according to the second background region, so that the first object and the second object are displayed in the virtual scene.
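With the registered background in hand, a simple background-difference mask suffices as an illustration of the matting step; the mask can then feed the extract_matting_object sketch shown in the terms section. The threshold is an assumed tuning parameter.

```python
import numpy as np

def matting_mask(frame: np.ndarray, background: np.ndarray,
                 threshold: float = 30.0) -> np.ndarray:
    """Boolean mask of pixels that differ enough from the registered background."""
    diff = np.linalg.norm(frame.astype(np.float32) - background.astype(np.float32), axis=-1)
    return diff > threshold

# Usage: mask = matting_mask(frame, background)
#        matte = extract_matting_object(frame, mask)  # sketch from the terms section
```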
Step 403, a viewing angle adjustment operation is received.
In some embodiments, a player-triggered perspective adjustment operation is received in response to the player meeting the observation switch condition. The implementation mode of the visual angle adjusting operation comprises at least one of the following modes:
First, a key-in operation is received through an external input device such as a keyboard or a mouse, and the key-in operation corresponding to the viewing angle adjustment function is determined as the viewing angle adjustment operation.
The viewing angle adjustment operation can be carried out accurately and quickly through the external input device, which improves the accuracy of its implementation.
Secondly, an angle adjustment control is displayed on the virtual environment picture, and the visual angle adjustment operation is the operation received on the angle adjustment control.
The angle adjustment control allows the angle adjustment operation to be carried out accurately and quickly, improving the accuracy of its execution.
Third, the viewing angle is controlled through voice input, that is, a received voice input operation serves as the viewing angle adjustment operation.
Because the virtual environment in the embodiments of the present application is displayed in combination with the player, that is, the player's motion is recorded during the game, there are situations where operating an external input device or the smart device is inconvenient. Controlling the viewing angle by voice input means the player needs no additional operation of an external input device or the smart device, which improves the efficiency of viewing angle control.
Fourth, an angle adjustment control is displayed in the virtual environment picture, and the player strikes a corresponding posture in the real scene, so that after the server performs matting and the result is displayed in the virtual environment picture, the matting object contacts the angle adjustment control through the action performed by the player, and this contact between the matting object and the angle adjustment control is taken as the corresponding angle adjustment operation. Illustratively, referring to fig. 5, after the player makes a gesture in the real scene, the server performs matting on the scene image to obtain a matting object 510 corresponding to the player; when the matting object 510 is mapped into the virtual environment picture 500, the foot 511 of the matting object coincides with the angle adjustment control 520 in the virtual environment picture 500, which counts as a contact operation with the angle adjustment control 520, so that the angle adjustment operation is determined. A sketch of this contact test follows.
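The contact check itself can be a plain bounding-box overlap test between the matting object's foot region and the control, both in screen space. A minimal sketch under that assumption; Box, boxes_overlap, and check_angle_trigger are illustrative names.

```python
from dataclasses import dataclass

@dataclass
class Box:
    x: float  # left edge in screen coordinates
    y: float  # top edge
    w: float  # width
    h: float  # height

def boxes_overlap(a: Box, b: Box) -> bool:
    """Axis-aligned bounding-box overlap test."""
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

def check_angle_trigger(foot_box: Box, control_box: Box) -> bool:
    # Each frame, the matting object is mapped into screen space; when its
    # foot region covers the angle adjustment control, the contact counts
    # as an angle adjustment operation.
    return boxes_overlap(foot_box, control_box)
```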
Since the virtual environment in the embodiments of the present application is displayed in combination with the player, that is, the player's motion is recorded during the game, realizing angle adjustment through the contact between the player's action and the angle adjustment control increases the diversity of interaction between the player and the virtual scene and strengthens the interactive capability of the virtual scene.
It should be noted that the implementation manner of the above-mentioned viewing angle adjusting operation is only an illustrative example, and the present application embodiment does not limit this.
Step 404, adjusting the first observation angle for observing the virtual scene and the matting object to a second observation angle based on the viewing angle adjustment operation.
Optionally, the first viewing angle and the second viewing angle are angles from which the virtual scene is viewed from different viewing positions or different viewing distances.
The adjustment mode of the observation angle corresponds to the viewing angle adjustment operation; alternatively, the adjustment mode of the observation angle is set in advance.
Illustratively, when the adjustment mode of the observation angle corresponds to the viewing angle adjustment operation: for example, a left adjustment control and a right adjustment control are displayed in the virtual environment picture, and when an angle adjustment operation on the left adjustment control is received, the first observation angle is adjusted leftwards to obtain the second observation angle.
When the adjustment mode of the observation angle is preset: for example, a viewing angle adjustment control is displayed in the virtual environment picture, and when an angle adjustment operation on that control is received, the viewing angle is adjusted in a preset direction to obtain the second observation angle.
The adjustment of the first observation angle to the second observation angle comprises rotation adjustment in the horizontal direction of the virtual scene; or distance adjustment in the observation direction; or arbitrary adjustment in the spherical direction, which is not limited in the embodiments of the present application.
Schematically, fig. 6 shows the viewing angle adjustment manner provided by an exemplary embodiment of the present application. The camera model 601 observes the virtual scene on the spherical observation model 600 in a direction toward the center of the sphere. When it rotates along the observation orbit 610, the adjustment from the first observation angle to the second observation angle is rotation adjustment in the horizontal direction of the virtual scene; when the distance between the camera model 601 and the center of the sphere changes, it is distance adjustment in the observation direction; when the camera model 601 rotates arbitrarily on the spherical surface, it is arbitrary adjustment in the spherical direction. A coordinate sketch of this spherical model follows.
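The spherical observation model maps directly onto spherical coordinates: an azimuth along the horizontal orbit, an elevation for arbitrary spherical movement, and a radius for the observation distance. A minimal sketch, with camera_position as an illustrative name.

```python
import math

def camera_position(center, radius: float, azimuth: float, elevation: float):
    """Position of the camera model on the spherical observation model.

    The camera sits on a sphere of `radius` around `center` and looks
    toward the sphere center. Changing `azimuth` moves it along the
    horizontal observation orbit; changing `radius` is the distance
    adjustment in the observation direction; changing both angles gives
    arbitrary adjustment in the spherical direction.
    """
    cx, cy, cz = center
    x = cx + radius * math.cos(elevation) * math.cos(azimuth)
    y = cy + radius * math.sin(elevation)
    z = cz + radius * math.cos(elevation) * math.sin(azimuth)
    return (x, y, z)

# Usage: orbit horizontally by changing the azimuth at a fixed elevation.
pos = camera_position((0.0, 0.0, 0.0), radius=10.0,
                      azimuth=math.radians(45), elevation=math.radians(20))
```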
Rotation adjustment in the horizontal direction of the virtual scene realizes surrounding observation of the virtual scene in the transverse direction, that is, the virtual scene is observed completely through 360 degrees, displaying its various details and improving observation efficiency.
Distance adjustment in the observation direction adjusts the distance between the observation point and the matting object. When the matting object is observed at a closer distance, more observation details can be acquired, producing a close-range observation effect; when it is observed at a farther distance, more comprehensive information about the combination of the matting object and the virtual scene can be acquired, producing a long-range observation effect.
Arbitrary adjustment in the spherical direction realizes an all-round adjustment of the observation angle, showing how the matting object combines with the virtual scene from different angles and increasing the diversity of observation modes.
To sum up, the interactive observation method based on a virtual scene provided by the embodiment of the application displays the matting object corresponding to the player in combination with the virtual scene, and adjusts the observation angle for observing the virtual scene through the viewing angle adjustment operation while the matting object is displayed in the virtual scene. Thus, while the matting object improves the interactive diversity of the virtual scene, the ways of adjusting the angle for observing the virtual scene are increased and the efficiency of observing the virtual scene is improved.
Fig. 7 is a schematic diagram of an overall implementation environment provided by an exemplary embodiment of the present application. As shown in fig. 7, player A operates a smart device 710 to participate in a game, and the smart device 710 is configured with a camera to capture images of player A; player B operates a smart device 720 to participate in the game, and the smart device 720 is configured with a camera to capture images of player B. The server 740 receives the collected scene images sent by the smart devices 710 and 720, performs matting on them to obtain the matting objects corresponding to players A and B, and feeds back the virtual scene and the matting objects to the viewing terminal 750 for display; the smart devices 710 and 720 also serve as viewing terminals.
In some embodiments, since the matting object is a two-dimensional image, its display direction also needs to be controlled during viewing angle adjustment so that the complete image content remains visible. Fig. 8 is a flowchart of an interactive observation method based on a virtual scene according to another exemplary embodiment of the present application, described taking application in a first terminal configured with a camera as an example. As shown in fig. 8, the method includes:
step 801, acquiring a first scene image through a camera in response to a virtual scene display operation.
In some embodiments, the virtual scene display operation refers to an operation in which a user instructs to open a virtual scene.
The first scene image comprises a first object located in a shooting range of a camera of the first terminal. That is, the first scene image refers to an image acquired by a camera configured at the first terminal.
In some embodiments, when the terminal receives the virtual scene display operation, the camera is turned on and the first scene image is collected by the camera.
Step 802, displaying a virtual environment picture, wherein the virtual environment picture includes a virtual scene and a matting object observed at a first observation angle.
The cutout object comprises a first object obtained by cutout of a first scene image and a second object obtained by cutout of a second scene image, and the second scene image is an image acquired by a second terminal provided with a camera.
The current virtual environment picture is a picture obtained by observing a virtual scene at a first observation angle, wherein the first observation angle refers to an observation angle of a camera model for observing the virtual scene.
Step 803, receive a view angle adjustment operation.
In some embodiments, a player-triggered perspective adjustment operation is received in response to the player meeting the observation switch condition.
Step 804, adjusting the first observation angle for observing the virtual scene to a second observation angle based on the viewing angle adjustment operation.
In some embodiments, the terminal corresponds to a camera model in the virtual scene, and the camera model is used for adjusting the observation angle. The camera model is adjusted from a first viewing position for viewing the virtual scene at a first viewing angle to a second viewing position for viewing the virtual scene at a second viewing angle based on the perspective adjustment operation. In some embodiments, the first viewing position and the second viewing position are two viewing positions on the same sphere, the first viewing angle being the angle from the first viewing position towards the center of the sphere and the second viewing angle being the angle from the second viewing position towards the center of the sphere.
In some embodiments, the angle difference (or distance difference) between the first observation angle and the second observation angle is predetermined; or it is determined from the viewing angle adjustment operation, so that the player can select the required observation angle through a stable change relationship, improving observation accuracy; or the second observation angle is obtained by randomly adjusting some angle difference (or distance difference) from the first observation angle, so that the player observes the virtual scene from a randomly determined angle, adding interest to the observation. The embodiment of the present application does not limit the change relationship between the first observation angle and the second observation angle; the sketch below enumerates the three variants.
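All three variants reduce to choosing an azimuth delta on the observation sphere. A minimal sketch under that reading; PRESET_STEP, the mode strings, and second_azimuth are illustrative assumptions.

```python
import math
import random

PRESET_STEP = math.radians(15)  # assumed fixed angular difference

def second_azimuth(first_azimuth: float, mode: str, drag_amount: float = 0.0) -> float:
    """Derive the second observation angle from the first, per the three
    variants in the embodiment: a preset difference, a difference read
    from the adjustment operation itself, or a random difference."""
    if mode == "preset":
        return first_azimuth + PRESET_STEP
    if mode == "by_operation":
        return first_azimuth + drag_amount  # stable, player-controlled change
    return first_azimuth + random.uniform(0.0, 2.0 * math.pi)  # random angle
```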
That is, each player corresponds to one camera model in the virtual scene, and the virtual environment picture is acquired through that camera model, so that the observation angle can be adjusted simply by rotating the camera model's observation position.
In other embodiments, the virtual scene corresponds to a camera model, wherein the first observation angle corresponds to the first camera model, the second observation angle corresponds to the second camera model, and the observation of the virtual scene by the first camera model is switched to the observation of the virtual scene by the second camera model based on the viewing angle adjustment operation.
That is, each player corresponds to at least two camera models in the virtual scene; or the current player corresponds to at least two camera models in the virtual scene, so that the observation angle can be adjusted simply by switching between different camera models.
When the observation angle is adjusted by switching between at least two camera models, the adjustment is made by switching directly among the several preset camera models; the rotation and movement of a camera model need not be computed, which reduces the computation load on the server.
When a single camera model is rotated and moved to adjust the observation angle, the rotation angle of the camera model can be controlled precisely, improving the accuracy of observation angle switching.
Step 805, keeping the position of the matting object in the virtual scene unchanged, and displaying the matting object facing the second observation angle.
Because the scene image is a two-dimensional image, the matting object cut out of it is also two-dimensional. If the display direction of the matting object were not changed, the matting object would easily be displayed as a flat "paper person" or an incomplete portrait when the virtual scene is observed from other viewing angles. Therefore, in the embodiment of the application, the matting object is kept facing the observation viewing angle at all times.
In some embodiments, when the virtual scene is observed through the camera model, the position of the matte object in the virtual scene is kept unchanged, and the matte object is controlled to be displayed facing the camera model corresponding to the second observation visual angle.
Referring to fig. 9, which is a schematic diagram illustrating a viewing angle change process provided by an exemplary embodiment of the present application, as shown in fig. 9, a virtual environment picture 910 is a picture obtained by observing a virtual scene through a camera model at a first viewing angle, the virtual environment picture 910 includes a matte object 901 and a matte object 902, and the matte object 901 and the matte object 902 are displayed in a direction facing the first viewing angle; when the visual angle adjusting operation is received, displaying a virtual environment picture 920 obtained by observing the virtual scene at a second observation angle through the camera model, wherein the virtual environment picture 920 comprises a matting object 901 and a matting object 902, and the matting object 901 and the matting object 902 are displayed in a direction facing the second observation visual angle.
That is, regardless of whether the virtual scene is viewed from a first viewing angle or from a second viewing angle, the matte object remains displayed intact in a direction facing the viewing angle. In some embodiments, after the viewing perspective is determined, an image of the virtual scene is captured at the viewing perspective, and the matting object is attached to the image of the virtual scene as a picture of the matting object located in the virtual scene.
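Keeping a flat matting object facing the camera is standard billboarding: the quad's position stays fixed and only its rotation about the vertical axis tracks the camera. A minimal sketch; billboard_yaw is an illustrative name.

```python
import math

def billboard_yaw(object_pos, camera_pos) -> float:
    """Yaw (rotation about the vertical y axis) that turns a flat matting
    object at `object_pos` to face the camera model at `camera_pos`,
    while the object's position in the virtual scene stays unchanged."""
    dx = camera_pos[0] - object_pos[0]
    dz = camera_pos[2] - object_pos[2]
    return math.atan2(dx, dz)  # angle that points the quad's normal at the camera
```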
To sum up, the method provided by this embodiment displays the matting object corresponding to the player in combination with the virtual scene, and adjusts the observation angle through the viewing angle adjustment operation while the matting object is displayed in the virtual scene, so that the interactive diversity of the virtual scene is improved, the ways of adjusting the observation angle are increased, and the efficiency of observing the virtual scene is improved.
In addition, when the viewing angle is switched, the method keeps the display position of the matting object unchanged and correspondingly adjusts its display direction, ensuring that the matting object is displayed facing the observation angle and avoiding the observation distortion that would result from the matting object being a two-dimensional image.
In some embodiments, the viewing angle adjustment operation includes two kinds of adjustment: direction adjustment and distance adjustment. Fig. 10 is a flowchart of an interactive observation method based on a virtual scene according to another exemplary embodiment of the present application, described taking application in a first terminal configured with a camera as an example. As shown in fig. 10, the method includes:
step 1001, a first scene image is acquired by a camera in response to a virtual scene display operation.
In some embodiments, the virtual scene display operation refers to an operation in which a user instructs to open a virtual scene.
The first scene image comprises a first object located in a shooting range of a camera of the first terminal. That is, the first scene image refers to an image acquired by a camera configured at the first terminal.
In some embodiments, when the terminal receives the virtual scene display operation, the camera is turned on and the first scene image is collected by the camera.
Step 1002, displaying a virtual environment picture, where the virtual environment picture includes a virtual scene and a matting object observed at a first observation angle.
The cutout object comprises a first object obtained by cutout of a first scene image and a second object obtained by cutout of a second scene image, and the second scene image is an image acquired by a second terminal provided with a camera.
The current virtual environment picture is a picture obtained by observing a virtual scene at a first observation angle, wherein the first observation angle refers to an observation angle of a camera model for observing the virtual scene.
Step 1003, receiving a viewing angle adjusting operation.
Optionally, an angle adjustment control is displayed on the virtual environment screen, and the angle adjustment operation is an operation received on the angle adjustment control.
Considering that the player's image is collected into the virtual scene for display, in the embodiment of the present application the angle adjustment operation is illustrated as being triggered by contact between the matting object and the angle adjustment control on the virtual environment picture.
In some embodiments, the angle adjustment control includes a direction adjustment region; a direction adjustment operation on the direction adjustment region is received when the observation direction needs to be adjusted. Illustratively, the direction adjustment region comprises a left adjustment control, a right adjustment control, an up adjustment control and a down adjustment control. Taking the left adjustment control as an example, when a trigger operation on the left adjustment control is received, the observation position corresponding to the first observation angle is rotated to the left by a preset or random angle, that is, on the observation sphere, the observation position corresponding to the first observation angle is rotated leftwards by a certain angle to a new observation position; or the camera model used for observing the virtual scene is switched from the camera model at the first observation position corresponding to the first observation angle to a camera model at a second observation position rotated leftwards by a certain angle.
In some embodiments, the angle adjustment control includes a distance adjustment region. When the observation distance needs to be adjusted, a distance adjustment operation on the distance adjustment region is received, and the distance at which the virtual scene is observed is adjusted based on that operation; the distance adjustment operation adjusts the position of the observation point along the observation direction. Optionally, the distance adjustment region can adjust the observation distance only within a preset distance range; that is, the adjustment has an observation distance upper limit and an observation distance lower limit, which prevents the observation distance from being over-adjusted to the point where the cutout object, being a two-dimensional image, appears distorted at close range.
Optionally, the observation distance upper limit and lower limit refer to the upper and lower bounds of the distance between the observation position and the observation center. That is, when the distance between the observation position and the observation center reaches the minimum distance (the observation distance lower limit), the observation position can no longer be adjusted toward the observation center; when the distance reaches the maximum distance (the observation distance upper limit), the observation position can no longer be adjusted away from the observation center. Setting these limits bounds the level of detail at which the virtual scene and the cutout object are observed: it avoids the heavy rendering cost incurred when the observation distance is too close, and avoids the low observation efficiency that results when the observation distance is too far and the interface content can no longer be seen clearly.
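Illustratively, the clamping can be sketched as follows (Python, assumed names; the concrete limit values are not specified by the embodiment):

    import numpy as np

    def adjust_observation_distance(observation_pos, center, delta, d_min, d_max):
        # Move the observation position along the line through the observation
        # center, clamping the resulting distance to the observation distance
        # lower limit (d_min) and upper limit (d_max) described above.
        center = np.asarray(center, dtype=float)
        offset = np.asarray(observation_pos, dtype=float) - center
        distance = np.linalg.norm(offset)
        new_distance = min(max(distance + delta, d_min), d_max)
        return center + offset * (new_distance / distance)

A positive delta moves the observation position away from the observation center and a negative delta moves it closer, but the returned position never violates the two limits.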
Step 1004, determining the adjustment manner corresponding to the viewing angle adjustment operation on the angle adjustment control.
In some embodiments, it is determined whether the viewing angle adjustment operation is an operation on the direction adjustment region or an operation on the distance adjustment region. When the operation targets the direction adjustment region, the angle at which the virtual scene is observed is rotated; when it targets the distance adjustment region, the observation is zoomed in or out. That is, when the view is zoomed in, the virtual scene and the objects in it are displayed enlarged, and when the view is zoomed out, they are displayed reduced.
Referring to fig. 11, an angle adjustment control 1110 is schematically displayed on the virtual environment screen 1100, wherein the angle adjustment control 1110 includes a direction adjustment area 1111 and a distance adjustment area 1112. When a direction adjustment operation on the direction adjustment region 1111 is received, adjusting an angle at which the virtual scene is observed in a manner of rotating the camera model; when a distance adjustment operation on the distance adjustment region 1112 is received, the angle at which the virtual scene is observed is adjusted in such a manner as to zoom in or out the camera model.
In some embodiments, the direction adjustment region 1111 and the distance adjustment region 1112 are displayed in linkage. Illustratively, when a direction adjustment operation is received in the direction adjustment region 1111, the observation direction is adjusted according to that operation and the distance adjustment region 1112 is displayed in linkage, so that a distance adjustment operation can then be received in the distance adjustment region 1112 to adjust the observation distance. Alternatively, when the observation distance is adjusted through a distance adjustment operation received in the distance adjustment region 1112, the direction adjustment region 1111 is displayed in linkage, and after a direction adjustment operation is received in the direction adjustment region 1111, the observation direction is adjusted accordingly.
Alternatively, adjustment options including a direction option and a distance option may be displayed on the angle adjustment control 1110. When a selection operation on the direction option is received, the direction adjustment region 1111 is displayed, and the control can return to displaying the adjustment options through a confirmation control; when a selection operation on the distance option is received, the distance adjustment region 1112 is displayed, and the control can likewise return to displaying the adjustment options through the confirmation control.
In some embodiments, the viewing angle adjustment operation includes both an operation on the direction adjustment region and an operation on the distance adjustment region. The adjustment manner then includes adjustment in both direction and distance, and the observation direction and the observation distance are adjusted correspondingly.
Step 1005, adjusting the first observation angle at which the virtual scene is observed to the second observation angle in the determined adjustment manner.
When the adjustment manner includes adjustment of the observation direction, the camera model is rotated or switched to adjust the observation direction; that is, either the observation position of a single camera model is rotated in the virtual scene, or observation is switched between multiple camera models. For example, when the observation direction is adjusted to the right, the camera model is rotated to the right around the observation center, or observation is switched from the first camera model to a second camera model on the right side.
When the adjustment manner includes adjustment of the observation distance, the camera model is translated along the observation direction; that is, the camera model is moved toward the observation center or away from the observation center, thereby adjusting the observation distance.
When the adjustment manner includes both the adjustment in the observation direction and the adjustment in the observation distance, the observation position of the camera model in the observation direction is adjusted while the camera model is rotated or switched.
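Illustratively, the two adjustment modes of step 1005 can be combined in a compact sketch (Python, hypothetical structure and names; the embodiment leaves the concrete organization open):

    import numpy as np

    class ObservationController:
        def __init__(self, camera_positions, center):
            # One entry per camera model placed around the observation center.
            self.cameras = [np.asarray(p, dtype=float) for p in camera_positions]
            self.center = np.asarray(center, dtype=float)
            self.active = 0  # index of the camera model currently used for observation

        def adjust_direction(self, step):
            # Direction adjustment by switching: observation moves to a
            # neighbouring camera model instead of rotating a single model.
            self.active = (self.active + step) % len(self.cameras)

        def adjust_distance(self, delta, d_min=2.0, d_max=20.0):
            # Distance adjustment: translate the active camera model along the
            # observation direction, within the preset limits (assumed values).
            offset = self.cameras[self.active] - self.center
            distance = np.linalg.norm(offset)
            new_distance = min(max(distance + delta, d_min), d_max)
            self.cameras[self.active] = self.center + offset * (new_distance / distance)

When the adjustment manner includes both direction and distance, adjust_direction and adjust_distance are simply applied in the same update.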
To sum up, the interactive observation method based on a virtual scene provided by the embodiments of the present application displays the cutout object corresponding to the player in combination with the virtual scene and, while the cutout object is displayed in the virtual scene, adjusts the observation angle of the virtual scene through the viewing angle adjustment operation. This improves the interaction diversity of the virtual scene through the cutout object, adds ways to adjust the angle at which the virtual scene is observed, and improves the observation efficiency of the virtual scene.
The method provided by this embodiment increases the diversity of ways in which the virtual scene can be observed: when the observation angle is adjusted, both the observation direction and the observation distance can be adjusted.
Schematically, the virtual-scene-based interaction method provided by the embodiments of the present application is described below as applied to a cloud game. Fig. 12 is a flowchart illustrating the overall interactive viewing method according to an exemplary embodiment of the present application; as shown in fig. 12, the process includes:
step 1201, the cloud server starts a cloud game.
The cloud game runs in the cloud server, so that players can access the cloud server to play the cloud game.
At step 1202, the player logs into the lobby.
In some embodiments, multiple players log into the cloud gaming lobby using an account system.
At step 1203, the player joins the room through the lobby.
In the embodiment of the application, at least two players are included in one cloud game room, and the at least two players can interact with each other.
At step 1204, the cloud server initializes player data.
The cloud server initializes game account data corresponding to the player.
Step 1205, the cloud server creates a personal rendering camera.
The cloud server creates a unique virtual camera for each player in the game scene. The virtual cameras are bound to the players one to one and are used to capture the game picture at each player's specified angle and transmit it back to that player.
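Illustratively, this one-to-one binding can be sketched as follows (Python with assumed names; the cloud server's actual data structures are not disclosed):

    class PersonalCameraRegistry:
        def __init__(self, make_camera):
            self._make_camera = make_camera  # factory that places a virtual camera in the game scene
            self._cameras = {}               # player_id -> that player's unique camera

        def camera_for(self, player_id):
            # Create the player's personal rendering camera on first access;
            # afterwards the same bound camera is always returned, so the
            # picture captured at this player's angle goes back only to them.
            if player_id not in self._cameras:
                self._cameras[player_id] = self._make_camera(player_id)
            return self._cameras[player_id]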
At step 1206, the cloud server creates a personal audio group.
A personal audio group is the audio group corresponding to one player and is used to deliver independent audio content to that player. Illustratively, when player A and player B are in the same virtual scene, and player A selects audio a as background audio while player B selects audio b, the data of audio a is transmitted to player A and the data of audio b is transmitted to player B for independent playback.
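Illustratively, the per-player audio routing can be sketched the same way (Python, assumed names; only the routing idea is taken from the embodiment):

    class PersonalAudioGroups:
        def __init__(self):
            self._selection = {}  # player_id -> chosen background track id

        def select_background(self, player_id, track_id):
            self._selection[player_id] = track_id

        def stream_for(self, player_id, track_store):
            # Each player receives only the audio they selected, even when
            # several players share the same virtual scene.
            return track_store[self._selection[player_id]]

With player A having selected audio a and player B audio b, stream_for returns different payloads for the two players, which are then delivered and played independently.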
Step 1207, the player establishes a connection with the cloud server and exchanges encoding and decoding information.
The cloud server encodes the gameplay video and sends it to the terminal, and the terminal decodes the encoded video to obtain a video stream for playback. Conversely, the terminal captures a video stream through its camera, encodes it, and sends the encoded stream to the cloud server for decoding and subsequent processing such as matting.
In step 1208, the cloud server renders the video stream through the personal rendering camera and sends the encoded video stream to the player.
The cloud server renders the player's video content, which includes the cutout object, virtual scene data, display position data, and the like; the video stream is rendered according to the computed display data, encoded, and sent to the terminal. Because the computing capacity of the cloud server far exceeds that of the terminal, the terminal only needs to receive, decode, and play the encoded video stream to display the progress of the cloud game, without computing or processing game data itself. This reduces the data processing load on the terminal side and improves the running efficiency and speed of the cloud game.
In step 1209, the player makes a data stream simulation input.
In some embodiments, the player first receives a portion of test data for simulated input. When the test data is input successfully, it is determined that the player can perform control operations on the cloud game, after which the player can further control the camera or other controls.
In step 1210, the cloud server controls the player's camera and operations.
That is, the cloud server acquires the viewing angle control operation performed by the player, and adjusts the position of the camera model according to the viewing angle control operation, thereby achieving adjustment of the viewing angle.
Fig. 13 is a block diagram of a virtual scene-based interactive viewing device according to an exemplary embodiment of the present application, and as shown in fig. 13, the device is applied to a first terminal configured with a camera, and the device includes:
an acquiring module 1310, configured to acquire, by the camera, a first scene image in response to a virtual scene display operation, where the first scene image includes a first object located within a camera shooting range of the first terminal;
a display module 1320, configured to display a virtual environment picture, where the virtual environment picture includes a virtual scene and a cutout object observed at a first observation angle, and the cutout object includes a first object obtained by cutout of the first scene image and a second object obtained by cutout of a second scene image, where the second scene image is an image acquired through a second terminal configured with a camera;
a receiving module 1330 configured to receive a view angle adjusting operation;
the display module 1320 is further configured to adjust a first observation angle for observing the virtual scene and the matting object to a second observation angle based on the viewing angle adjustment operation.
In an optional embodiment, the display module 1320 is further configured to adjust a first viewing angle for viewing the virtual scene to a second viewing angle based on the viewing angle adjustment operation; keeping the position of the matte object in the virtual scene unchanged, and displaying the matte object in a direction facing the second observation angle.
In an optional embodiment, the terminal corresponds to a camera model in the virtual scene, and the camera model is used for adjusting an observation angle;
as shown in fig. 14, the apparatus further includes:
an adjusting module 1340 configured to adjust the camera model from a first viewing position to a second viewing position based on the viewing angle adjusting operation, wherein the first viewing position is used for viewing the virtual scene at a first viewing angle, and the second viewing position is used for viewing the virtual scene at a second viewing angle.
In an optional embodiment, a camera model corresponds to the virtual scene, wherein the first observation angle corresponds to a first camera model, and the second observation angle corresponds to a second camera model;
the device, still include:
an adjusting module 1340 configured to switch from observing the virtual scene through the first camera model to observing the virtual scene through the second camera model based on the view angle adjusting operation.
In an optional embodiment, a camera model is correspondingly arranged in the virtual scene;
the display module 1320, further configured to keep the position of the matte object in the virtual scene unchanged; and controlling the sectional object to face the camera model corresponding to the second observation visual angle for displaying.
In an optional embodiment, an angle adjustment control is displayed on the virtual environment picture, and the angle adjustment operation is an operation received on the angle adjustment control;
the device further comprises:
the determining module is used for determining a corresponding adjusting mode of the visual angle adjusting operation on the angle adjusting control;
the display module 1320 is further configured to adjust the first viewing angle for viewing the virtual scene to the second viewing angle in the adjusting manner.
In an alternative embodiment, the angular adjustment control includes a direction adjustment area;
the receiving module 1330 is further configured to receive a direction adjustment operation on the direction adjustment area;
the display module 1320 is further configured to adjust the first observation angle for observing the virtual scene to the second observation angle based on the adjustment direction corresponding to the direction adjustment operation.
In an alternative embodiment, the angular adjustment control includes a distance adjustment region;
the receiving module 1330 is further configured to receive a distance adjusting operation on the distance adjusting area, where the distance adjusting operation is used to adjust a position of the observation point in the observation direction;
the device, still include:
an adjusting module 1340 configured to adjust an observation distance for observing the virtual scene based on the distance adjusting operation.
To sum up, the interactive viewing device based on a virtual scene provided by the embodiments of the present application displays the cutout object corresponding to the player in combination with the virtual scene and, while the cutout object is displayed in the virtual scene, adjusts the observation angle of the virtual scene through the viewing angle adjustment operation. This improves the interaction diversity of the virtual scene through the cutout object, adds ways to adjust the angle at which the virtual scene is observed, and improves the observation efficiency of the virtual scene.
It should be noted that: the interactive observation device based on a virtual scene provided in the above embodiment is only exemplified by the division of the above functional modules, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the interactive observation device based on the virtual scene provided by the above embodiment and the interactive observation method based on the virtual scene belong to the same concept, and specific implementation processes thereof are detailed in the method embodiment and are not described herein again.
Fig. 15 shows a block diagram of a terminal 1500 according to an exemplary embodiment of the present application. The terminal 1500 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 1500 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also referred to as a Central Processing Unit (CPU), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1502 is used to store at least one instruction for execution by processor 1501 to implement the virtual scene-based interactive viewing methods provided by method embodiments herein.
In some embodiments, the terminal 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1504, a display 1505, a camera 1506, an audio circuit 1507, a positioning assembly 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over the surface of the display screen 1505. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 1505 may be one, providing the front panel of terminal 1500; in other embodiments, display 1505 may be at least two, each disposed on a different surface of terminal 1500 or in a folded design; in still other embodiments, display 1505 may be a flexible display disposed on a curved surface or a folded surface of terminal 1500. Even further, the display 1505 may be configured in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 1505 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1506 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1507 may include a microphone and speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1501 for processing or inputting the electric signals to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the terminal 1500. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic position of the terminal 1500 for navigation or LBS (Location Based Service). The positioning component 1508 may be a positioning component based on the United States' GPS (Global Positioning System), China's BeiDou system, Russia's GLONASS system, or the European Union's Galileo system.
Power supply 1509 is used to supply power to the various components in terminal 1500. The power supply 1509 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is charged through a wired line, and the wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also be used to support fast-charge technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1501 may control the touch screen display 1505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1512 can detect the body direction and the rotation angle of the terminal 1500, and the gyroscope sensor 1512 and the acceleration sensor 1511 cooperate to collect the 3D motion of the user on the terminal 1500. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1513 may be disposed on a side bezel of terminal 1500 and/or underneath touch display 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal 1500, the holding signal of the user to the terminal 1500 may be detected, and the processor 1501 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at a lower layer of the touch display 1505, the processor 1501 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 1505. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1514 is configured to capture a fingerprint of the user, and the processor 1501 identifies the user based on the fingerprint captured by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user based on the captured fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1501 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1514 may be disposed on the front, back, or side of the terminal 1500. When a physical key or vendor Logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with the physical key or vendor Logo.
The optical sensor 1515 is used to collect ambient light intensity. In one embodiment, processor 1501 may control the brightness of the display on touch screen 1505 based on the intensity of ambient light collected by optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1505 is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
A proximity sensor 1516, also known as a distance sensor, is typically provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front surface of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually decreases, the processor 1501 controls the touch display 1505 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1516 detects that the distance gradually increases, the processor 1501 controls the touch display 1505 to switch from the screen-off state back to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 15 does not constitute a limitation of terminal 1500, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be employed.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. An interactive observation method based on a virtual scene is applied to a first terminal configured with a camera, and the method comprises the following steps:
acquiring a first scene image through the camera in response to a virtual scene display operation, wherein the first scene image comprises a first object located in a camera shooting range of the first terminal;
displaying a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a cutout object observed at a first observation angle, the cutout object comprises a first object obtained by cutout of the first scene image and a second object obtained by cutout of a second scene image, and the second scene image is an image acquired by a second terminal configured with a camera;
receiving a viewing angle adjustment operation;
based on the viewing angle adjustment operation, adjusting a first observation angle for observing the virtual scene and the cutout object to a second observation angle, wherein the first observation angle and the second observation angle are angles for observing the virtual scene from different observation positions or different observation distances.
2. The method of claim 1, wherein the adjusting a first observation angle for observing the virtual scene and the cutout object to a second observation angle based on the viewing angle adjustment operation comprises:
adjusting a first observation angle for observing the virtual scene to a second observation angle based on the viewing angle adjustment operation;
keeping the position of the cutout object in the virtual scene unchanged, and displaying the cutout object facing the second observation angle.
3. The method according to claim 2, wherein the first terminal corresponds to a camera model in the virtual scene, and the camera model is used for adjusting the observation angle;
the adjusting a first observation angle for observing the virtual scene to a second observation angle based on the viewing angle adjustment operation comprises:
adjusting the camera model from a first observation position to a second observation position based on the viewing angle adjustment operation, wherein the first observation position is used for observing the virtual scene at the first observation angle, and the second observation position is used for observing the virtual scene at the second observation angle.
4. The method of claim 2, wherein camera models are correspondingly provided in the virtual scene, the first observation angle corresponds to a first camera model, and the second observation angle corresponds to a second camera model;
the adjusting a first observation angle for observing the virtual scene to a second observation angle based on the viewing angle adjustment operation comprises:
switching from observing the virtual scene through the first camera model to observing the virtual scene through the second camera model based on the viewing angle adjustment operation.
5. The method according to claim 2, wherein a camera model is correspondingly provided in the virtual scene;
the keeping the position of the cutout object in the virtual scene unchanged, and displaying the cutout object facing the second observation angle comprises:
keeping the position of the cutout object in the virtual scene unchanged;
and controlling the cutout object to face the camera model corresponding to the second observation angle for display.
6. The method according to any one of claims 1 to 5, wherein an angle adjustment control is displayed on the virtual environment picture, and the viewing angle adjustment operation is an operation received on the angle adjustment control;
the adjusting a first observation angle for observing the virtual scene and the cutout object to a second observation angle based on the viewing angle adjustment operation comprises:
determining an adjustment manner corresponding to the viewing angle adjustment operation on the angle adjustment control;
and adjusting the first observation angle for observing the virtual scene to the second observation angle in the adjustment manner.
7. The method of claim 6, wherein the angle adjustment control comprises a direction adjustment region;
the receiving a viewing angle adjustment operation comprises:
receiving a direction adjustment operation on the direction adjustment region;
the adjusting the first observation angle for observing the virtual scene to the second observation angle in the adjustment manner comprises:
and adjusting the first observation angle for observing the virtual scene to the second observation angle based on the adjustment direction corresponding to the direction adjustment operation.
8. The method of claim 6, wherein the angle adjustment control comprises a distance adjustment region;
the method further comprises:
receiving a distance adjustment operation on the distance adjustment region, the distance adjustment operation being used to adjust a position of an observation point in an observation direction;
and adjusting an observation distance for observing the virtual scene based on the distance adjustment operation.
9. The method according to any one of claims 1 to 5, wherein the displaying a virtual environment picture comprises:
sending the first scene image to a server;
receiving picture display data fed back by the server, wherein the picture display data comprises scene data corresponding to the virtual scene and object data corresponding to the cutout object;
and displaying the virtual environment picture based on the scene data and the object data.
10. The method according to any one of claims 1 to 5, wherein before the displaying a virtual environment picture, the method further comprises:
displaying a calibration picture, wherein the calibration picture comprises the first scene image, the first scene image comprises an indication frame and an indication line, the indication frame is used for performing framing indication on the first object, and the indication line is located at a specified position of the first scene image and divides the first scene image into a first area and a second area;
and indicating the background portion of the first scene image by acquiring images of the process in which the first object moves from a first position to a second position, wherein the first position is the position of the indication frame in the first area, and the second position is the position of the indication frame in the second area.
11. The method according to any one of claims 1 to 5, wherein after the displaying a virtual environment picture, the method further comprises:
in response to the first object and the second object meeting an interaction requirement in the virtual scene, displaying an interaction animation through the virtual environment picture.
12. An interactive observation apparatus based on a virtual scene, applied to a first terminal configured with a camera, the apparatus comprising:
an acquisition module, configured to acquire a first scene image through the camera in response to a virtual scene display operation, wherein the first scene image comprises a first object located within the camera shooting range of the first terminal;
a display module, configured to display a virtual environment picture, wherein the virtual environment picture comprises a virtual scene and a cutout object observed at a first observation angle, the cutout object comprises a first object obtained by cutout of the first scene image and a second object obtained by cutout of a second scene image, and the second scene image is an image acquired by a second terminal configured with a camera;
a receiving module, configured to receive a viewing angle adjustment operation;
the display module is further configured to adjust a first observation angle for observing the virtual scene and the cutout object to a second observation angle based on the viewing angle adjustment operation, wherein the first observation angle and the second observation angle are angles for observing the virtual scene from different observation positions or different observation distances.
13. The apparatus of claim 12, wherein the display module is further configured to adjust a first observation angle for observing the virtual scene to a second observation angle based on the viewing angle adjustment operation, keep the position of the cutout object in the virtual scene unchanged, and display the cutout object facing the second observation angle.
14. A computer device comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the interactive observation method based on a virtual scene according to any one of claims 1 to 11.
15. A computer-readable storage medium, wherein at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the storage medium, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by a processor to implement the interactive observation method based on a virtual scene according to any one of claims 1 to 11.
CN202110702790.8A 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene Active CN113274729B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202311135567.5A CN117085322A (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene
CN202110702790.8A CN113274729B (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110702790.8A CN113274729B (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311135567.5A Division CN117085322A (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Publications (2)

Publication Number Publication Date
CN113274729A true CN113274729A (en) 2021-08-20
CN113274729B CN113274729B (en) 2023-08-22

Family

ID=77285349

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110702790.8A Active CN113274729B (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene
CN202311135567.5A Pending CN117085322A (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311135567.5A Pending CN117085322A (en) 2021-06-24 2021-06-24 Interactive observation method, device, equipment and medium based on virtual scene

Country Status (1)

Country Link
CN (2) CN113274729B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5870771A (en) * 1996-11-15 1999-02-09 Oberg; Larry B. Computerized system for selecting, adjusting, and previewing framing product combinations for artwork and other items to be framed
US20140063061A1 (en) * 2011-08-26 2014-03-06 Reincloud Corporation Determining a position of an item in a virtual augmented space
CN106293557A (en) * 2015-05-04 2017-01-04 北京智谷睿拓技术服务有限公司 Display control method and device
CN205987196U (en) * 2016-08-26 2017-02-22 万象三维视觉科技(北京)有限公司 Bore hole 3D virtual reality display system
CN106385576A (en) * 2016-09-07 2017-02-08 深圳超多维科技有限公司 Three-dimensional virtual reality live method and device, and electronic device
CN107230182A (en) * 2017-08-03 2017-10-03 腾讯科技(深圳)有限公司 A kind of processing method of image, device and storage medium
CN108762508A (en) * 2018-05-31 2018-11-06 北京小马当红文化传媒有限公司 A kind of human body and virtual thermal system system and method for experiencing cabin based on VR
CN110465073A (en) * 2019-08-08 2019-11-19 腾讯科技(深圳)有限公司 Method, apparatus, equipment and the readable storage medium storing program for executing that visual angle adjusts in virtual environment
CN110969905A (en) * 2019-11-29 2020-04-07 塔普翊海(上海)智能科技有限公司 Remote teaching interaction and teaching aid interaction system for mixed reality and interaction method thereof
CN111988534A (en) * 2020-07-23 2020-11-24 首都医科大学附属北京朝阳医院 Multi-camera-based picture splicing method and device

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113769397A (en) * 2021-09-28 2021-12-10 腾讯科技(深圳)有限公司 Virtual object setting method, device, equipment, medium and program product
CN113769397B (en) * 2021-09-28 2024-03-22 腾讯科技(深圳)有限公司 Virtual object setting method, device, equipment, medium and program product
CN114745598A (en) * 2022-04-12 2022-07-12 北京字跳网络技术有限公司 Video data display method and device, electronic equipment and storage medium
CN114745598B (en) * 2022-04-12 2024-03-19 北京字跳网络技术有限公司 Video data display method and device, electronic equipment and storage medium
CN115048017A (en) * 2022-07-28 2022-09-13 广东伟达智能装备股份有限公司 Control method for simulating synchronization of pick-and-place box and live-action in 3D control system
CN115048017B (en) * 2022-07-28 2023-10-17 广东伟达智能装备股份有限公司 Control method for synchronizing simulated grabbing and placing box and live-action in 3D control system

Also Published As

Publication number Publication date
CN113274729B (en) 2023-08-22
CN117085322A (en) 2023-11-21

Similar Documents

Publication Publication Date Title
CN111589128B (en) Operation control display method and device based on virtual scene
CN111013142B (en) Interactive effect display method and device, computer equipment and storage medium
CN111414080B (en) Method, device and equipment for displaying position of virtual object and storage medium
CN113274729B (en) Interactive observation method, device, equipment and medium based on virtual scene
CN111921197B (en) Method, device, terminal and storage medium for displaying game playback picture
CN107982918B (en) Game game result display method and device and terminal
CN111050189B (en) Live broadcast method, device, equipment and storage medium
CN113244616B (en) Interaction method, device and equipment based on virtual scene and readable storage medium
CN111318026B (en) Team forming method, device, equipment and storage medium for competitive game
CN111603770B (en) Virtual environment picture display method, device, equipment and medium
CN110496392B (en) Virtual object control method, device, terminal and storage medium
CN111672106B (en) Virtual scene display method and device, computer equipment and storage medium
JP2021535806A (en) Virtual environment observation methods, devices and storage media
CN109771955B (en) Invitation request processing method, device, terminal and storage medium
CN108579075B (en) Operation request response method, device, storage medium and system
CN112691372A (en) Virtual item display method, device, equipment and readable storage medium
CN113230655A (en) Virtual object control method, device, equipment, system and readable storage medium
CN111273780B (en) Animation playing method, device and equipment based on virtual environment and storage medium
CN112827166A (en) Card object-based interaction method and device, computer equipment and storage medium
CN110833695B (en) Service processing method, device, equipment and storage medium based on virtual scene
CN114288654A (en) Live broadcast interaction method, device, equipment, storage medium and computer program product
CN108184130B (en) Simulator system, live broadcast method, device and storage medium
CN112774185A (en) Virtual card control method, device and equipment in card virtual scene
CN112973116B (en) Virtual scene picture display method and device, computer equipment and storage medium
CN113599810B (en) Virtual object-based display control method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40051667

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant