CN113769395B - Virtual scene interaction method and device and electronic equipment - Google Patents


Info

Publication number
CN113769395B
CN113769395B (application CN202111141959.3A)
Authority
CN
China
Prior art keywords
account
virtual scene
human-computer interaction interface
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111141959.3A
Other languages
Chinese (zh)
Other versions
CN113769395A (en)
Inventor
文晗
王礼进
周西洋
李熠琦
陈印超
郭晨旭
张雅
张惠中
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd.
Priority claimed from CN202111141959.3A
Publication of CN113769395A
Application granted
Publication of CN113769395B
Legal status: Active


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30: Features of games using an electronically generated display having two or more dimensions, characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308: Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interaction method and apparatus for a virtual scene, an electronic device, a computer-readable storage medium, and a computer program product; related embodiments can be applied to scenarios such as cloud technology, artificial intelligence, and intelligent transportation. The interaction method for a virtual scene comprises the following steps: presenting a virtual scene in a first human-computer interaction interface, where the virtual scene is logged in to based on a first account; in response to a synchronization trigger operation for the virtual scene, synchronizing the virtual scene to a second human-computer interaction interface corresponding to a second account; and receiving interaction information sent by the second account based on the second human-computer interaction interface, and presenting the interaction information in the first human-computer interaction interface, where the interaction information is used to guide the control operations of the first account in the virtual scene. The method and apparatus can improve the control effect on the virtual scene and increase the actual utilization of the computing resources of the electronic device.

Description

Virtual scene interaction method and device and electronic equipment
Technical Field
The present application relates to computer technology, and in particular, to an interaction method and apparatus for a virtual scene, an electronic device, a computer-readable storage medium, and a computer program product.
Background
With the gradual maturation of display technologies based on graphics processing hardware, the channels through which users perceive an environment and acquire information have expanded. In particular, display technologies for virtual scenes allow interaction between a user and a virtual scene according to actual needs: the user is supported in controlling the virtual scene through an account, for example by controlling virtual objects in virtual scenes such as competitive games.
In the solutions provided by the related art, the virtual scene is generally controlled by the user according to his or her own experience and skill. However, when the user has little experience or poor skill, the control operations performed may be ineffective or even counterproductive, so that the computing resources the electronic device consumes in responding to these control operations are wasted; that is, the actual utilization of the computing resources consumed by the electronic device is low.
Disclosure of Invention
The embodiments of the present application provide an interaction method and apparatus for a virtual scene, an electronic device, a computer-readable storage medium, and a computer program product, which can improve the control effect on the virtual scene and increase the actual utilization of the computing resources consumed by the electronic device.
The technical scheme of the embodiment of the application is realized as follows:
An embodiment of the present application provides an interaction method for a virtual scene, which comprises:
presenting the virtual scene in a first human-computer interaction interface, wherein the virtual scene is logged in to based on a first account;
in response to a synchronization trigger operation for the virtual scene, synchronizing the virtual scene to a second human-computer interaction interface corresponding to a second account;
receiving interaction information sent by the second account based on the second human-computer interaction interface, and presenting the interaction information in the first human-computer interaction interface; wherein the interaction information is used to guide the control operations of the first account in the virtual scene.
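The three steps of this first-terminal method can be reduced to a minimal sketch; every class, field, and message name below is an illustrative assumption, not the patent's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class FirstTerminal:
    """Hypothetical model of the first account's terminal: it presents a
    virtual scene, mirrors it to a second account on a sync trigger, and
    displays guidance received back from synced accounts."""
    first_account: str
    scene: str
    synced_to: list = field(default_factory=list)       # second accounts
    presented_info: list = field(default_factory=list)  # guidance shown in the UI

    def on_sync_trigger(self, second_account: str) -> None:
        # Step 2: synchronize the scene to the second account's interface.
        self.synced_to.append(second_account)

    def on_interaction_info(self, sender: str, info: str) -> None:
        # Step 3: present guidance only from an account the scene was synced to.
        if sender in self.synced_to:
            self.presented_info.append((sender, info))

terminal = FirstTerminal("player_A", "mahjong_table_1")
terminal.on_sync_trigger("mentor_B")
terminal.on_interaction_info("mentor_B", "keep the pair, discard the lone tile")
```

The guard in `on_interaction_info` reflects the claim's framing that the interaction information comes from the second account to which the scene was synchronized.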
An embodiment of the present application provides an interaction method for a virtual scene, which comprises:
presenting, in a second human-computer interaction interface corresponding to a second account, the virtual scene synchronized from a first human-computer interaction interface, wherein the virtual scene is logged in to based on a first account;
receiving, through the second human-computer interaction interface, interaction information for the virtual scene;
sending the interaction information to the first account, so that the first account presents the interaction information in the first human-computer interaction interface; wherein the interaction information is used to guide the control operations of the first account in the virtual scene.
In the above solution, before presenting, in the second human-computer interaction interface corresponding to the second account, the virtual scene synchronized from the first human-computer interaction interface, the method further comprises:
presenting, in the second human-computer interaction interface, a plurality of candidate accounts different from the second account;
in response to a selection operation on the plurality of candidate accounts, taking the selected candidate account as the first account.
In the above solution, the method further comprises performing at least one of the following:
taking an account that has a friend relationship with the second account as a candidate account;
determining a synchronization geographic range centered on the geographic location of the second account, and taking an account within the synchronization geographic range as a candidate account;
taking a historical synchronization account of the second account as a candidate account, wherein a historical synchronization account is an account whose virtual scene the second account has historically synchronized;
determining a historical operation record of the second account in the virtual scene, and taking an account that has a target relationship with the second account in the historical operation record as a candidate account, wherein the target relationship comprises at least one of a cooperative relationship, an adversarial relationship, and a neutral relationship;
determining a parameter condition according to account attribute parameters of the second account in the virtual scene, and taking an account whose account attribute parameters satisfy the parameter condition as a candidate account, wherein the account attribute parameters comprise at least one of a level, a number of virtual resources, and a number of games played.
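Three of the candidate-account rules above (friend relationship, synchronization geographic range, historical synchronization accounts) can be illustrated with a small sketch; the data model (`friends`, `pos`, `history_synced`) and the degree-to-kilometre conversion are assumptions for illustration only:

```python
import math

def geo_distance_km(a, b):
    # Crude planar approximation (~111 km per degree); sufficient for a sketch.
    return math.dist(a, b) * 111.0

def determine_candidates(second_account, accounts, sync_radius_km=50.0):
    """Collect candidate accounts for `second_account` by applying, in
    parallel, the friend, geographic-range, and history rules above."""
    me = accounts[second_account]
    candidates = set()
    for account_id, attrs in accounts.items():
        if account_id == second_account:
            continue
        if account_id in me["friends"]:                 # friend relationship
            candidates.add(account_id)
        if geo_distance_km(me["pos"], attrs["pos"]) <= sync_radius_km:
            candidates.add(account_id)                  # within the geo range
        if account_id in me["history_synced"]:          # historically synced
            candidates.add(account_id)
    return candidates

accounts = {
    "B": {"friends": {"A"}, "pos": (0.0, 0.0), "history_synced": {"C"}},
    "A": {"friends": set(), "pos": (10.0, 10.0), "history_synced": set()},
    "C": {"friends": set(), "pos": (20.0, 20.0), "history_synced": set()},
    "D": {"friends": set(), "pos": (0.1, 0.1), "history_synced": set()},
}
candidates = determine_candidates("B", accounts)  # {"A", "C", "D"}
```

The two remaining rules (target relationship in the historical operation record, parameter conditions over account attribute parameters) would be further membership tests of the same shape.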
In the above solution, before presenting, in the second human-computer interaction interface corresponding to the second account, the virtual scene synchronized from the first human-computer interaction interface, the method further comprises:
screening the plurality of candidate accounts different from the second account according to their respective screening parameters to obtain the first account;
wherein the screening parameters comprise at least one of:
friend affinity with the second account;
geographic distance from the second account;
number of historical synchronizations with the second account;
number of occurrences in the historical operation record of the second account;
account attribute parameters in the virtual scene, the account attribute parameters comprising at least one of a level, a number of virtual resources, and a number of games played.
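One way to read the screening step is as a weighted ranking over the listed parameters; the weights and field names below are assumptions, not values taken from the patent:

```python
def screen_first_account(candidates, weights=None):
    """Pick the first account by scoring each candidate over its screening
    parameters (friend affinity, historical sync count, level, ...) and
    keeping the highest-scoring one. Purely illustrative."""
    weights = weights or {"affinity": 1.0, "sync_count": 0.5, "level": 0.1}

    def score(candidate):
        return sum(w * candidate.get(name, 0) for name, w in weights.items())

    return max(candidates, key=score)["id"]

candidates = [
    {"id": "A", "affinity": 3, "sync_count": 0, "level": 10},  # score 4.0
    {"id": "C", "affinity": 0, "sync_count": 10, "level": 0},  # score 5.0
]
first_account = screen_first_account(candidates)  # "C" is selected
```

Any monotone combination of the parameters would fit the claim equally well; the weighted sum is just the simplest choice to sketch.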
In the above solution, before presenting, in the second human-computer interaction interface corresponding to the second account, the virtual scene synchronized from the first human-computer interaction interface, the method further comprises:
receiving a synchronization request sent by the first account, wherein the synchronization request comprises at least one of an account identifier of the first account and a scene identifier of the virtual scene;
in response to a confirmation operation on the synchronization request, sending confirmation information for the synchronization request to the first account, so that the first account synchronizes the virtual scene to the second human-computer interaction interface according to the confirmation information.
In the above solution, before presenting, in the second human-computer interaction interface corresponding to the second account, the virtual scene synchronized from the first human-computer interaction interface, the method further comprises:
in response to a synchronization trigger operation for the virtual scene, sending a synchronization request to the first account, so that the first account synchronizes the virtual scene to the second human-computer interaction interface according to the synchronization request; wherein the synchronization request comprises an account identifier of the second account.
In the above solution, the method further comprises:
presenting a synchronization entry for the virtual scene in the second human-computer interaction interface;
taking a trigger operation on the synchronization entry as the synchronization trigger operation for the virtual scene.
In the above solution, when presenting, in the second human-computer interaction interface corresponding to the second account, the virtual scene synchronized from the first human-computer interaction interface, the method further comprises:
establishing an interactive connection between the first account and the second account;
wherein the interactive connection is used to send the interaction information to the first account.
In the above solution, establishing the interactive connection between the first account and the second account comprises:
establishing the interactive connection between the first account and the second account according to a target media type;
wherein the target media type is the media type specified by the synchronization trigger operation, the synchronization trigger operation is used to trigger synchronization of the virtual scene, and the interactive connection is used to transmit interaction information conforming to the target media type.
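A connection restricted to the media type fixed by the trigger operation might look like the sketch below; the enum values and the `send` check are assumptions, not the patent's protocol:

```python
from enum import Enum

class MediaType(Enum):
    TEXT = "text"
    VOICE = "voice"
    VIDEO = "video"

class InteractiveConnection:
    """Hypothetical connection between the two accounts that only relays
    interaction information conforming to its target media type."""
    def __init__(self, first_account, second_account, target: MediaType):
        self.ends = (first_account, second_account)
        self.target = target
        self.delivered = []

    def send(self, payload, media_type: MediaType):
        # Reject interaction information that does not conform to the
        # media type specified when the connection was established.
        if media_type is not self.target:
            raise ValueError(f"connection only carries {self.target.value}")
        self.delivered.append(payload)

conn = InteractiveConnection("player_A", "mentor_B", MediaType.VOICE)
conn.send("voice-frame-1", MediaType.VOICE)  # accepted
```

Attempting `conn.send(..., MediaType.TEXT)` on this voice connection would raise, mirroring the constraint that only information of the target media type is transmitted.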
In the above solution, the method further comprises:
in response to an interaction trigger operation for the virtual scene, establishing an interactive connection between the first account and the second account;
wherein the interactive connection is used to send the interaction information to the first account.
In the above solution, the method further comprises:
presenting an interaction entry for the virtual scene in the second human-computer interaction interface;
taking a trigger operation on the interaction entry as the interaction trigger operation for the virtual scene.
In the above solution, establishing the interactive connection between the first account and the second account in response to the interaction trigger operation for the virtual scene comprises:
in response to the interaction trigger operation for the virtual scene, sending a connection request to the first account, wherein the connection request comprises an account identifier of the second account;
when confirmation information for the connection request is received from the first account, establishing the interactive connection between the first account and the second account.
In the above solution, the method further comprises:
receiving a connection request sent by the first account based on the first human-computer interaction interface, wherein the connection request comprises at least one of an account identifier of the first account and a scene identifier of the virtual scene;
taking a confirmation operation on the connection request as the interaction trigger operation for the virtual scene.
In the above solution, establishing the interactive connection between the first account and the second account comprises:
establishing the interactive connection between the first account and the second account according to a target media type, wherein the target media type is the media type specified by the interaction trigger operation, and the interactive connection is used to transmit interaction information conforming to the target media type.
In the above solution, the method further comprises:
presenting, in the second human-computer interaction interface, at least one of an interactive connection state, the interaction information, an end-interaction entry, and an end-synchronization entry;
wherein the interactive connection is used to send the interaction information to the first account, the end-interaction entry is used to disconnect the interactive connection when triggered, and the end-synchronization entry is used to disconnect the interactive connection when triggered and to stop presenting the virtual scene in the second human-computer interaction interface.
In the above solution, before presenting, in the second human-computer interaction interface corresponding to the second account, the virtual scene synchronized from the first human-computer interaction interface, the method further comprises:
sending synchronization region parameters to the first account, so that the first account synchronizes the region of the virtual scene corresponding to the synchronization region parameters to the second human-computer interaction interface.
In the above solution, the method further comprises:
receiving consultation information sent by the first account, and presenting the consultation information in the second human-computer interaction interface.
An embodiment of the present application provides an interaction apparatus for a virtual scene, which comprises:
a scene presentation module, configured to present the virtual scene in a first human-computer interaction interface, wherein the virtual scene is logged in to based on a first account;
a scene synchronization module, configured to synchronize, in response to a synchronization trigger operation for the virtual scene, the virtual scene to a second human-computer interaction interface corresponding to a second account;
an information presentation module, configured to receive interaction information sent by the second account based on the second human-computer interaction interface and to present the interaction information in the first human-computer interaction interface, wherein the interaction information is used to guide the control operations of the first account in the virtual scene.
An embodiment of the present application provides an interaction apparatus for a virtual scene, which comprises:
a synchronous presentation module, configured to present, in a second human-computer interaction interface corresponding to a second account, the virtual scene synchronized from a first human-computer interaction interface, wherein the virtual scene is logged in to based on a first account;
a receiving module, configured to receive, through the second human-computer interaction interface, interaction information for the virtual scene;
a sending module, configured to send the interaction information to the first account, so that the first account presents the interaction information in the first human-computer interaction interface, wherein the interaction information is used to guide the control operations of the first account in the virtual scene.
An embodiment of the present application provides an electronic device, comprising:
a memory for storing executable instructions;
a processor configured to implement, when executing the executable instructions stored in the memory, the interaction method for a virtual scene provided by the embodiments of the present application.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the interaction method for a virtual scene provided by the embodiments of the present application.
An embodiment of the present application provides a computer program product comprising executable instructions which, when executed by a processor, implement the interaction method for a virtual scene provided by the embodiments of the present application.
The embodiments of the present application have the following beneficial effects:
the virtual scene presented in the first human-computer interaction interface is synchronized to the second human-computer interaction interface corresponding to the second account, and the interaction information sent by the second account based on the second human-computer interaction interface is presented in the first human-computer interaction interface. In this way, effective control operations can be applied to the virtual scene by drawing on the second account's experience and skill in controlling virtual scenes, which improves the control effect on the virtual scene and, at the same time, increases the actual utilization of the computing resources consumed by the electronic device.
Drawings
FIG. 1A is a first schematic illustration of a game tutorial provided by an embodiment of the present application;
FIG. 1B is a second schematic illustration of a game tutorial provided by an embodiment of the present application;
FIG. 1C is a third schematic illustration of a game tutorial provided by an embodiment of the present application;
FIG. 1D is a fourth schematic illustration of a game tutorial provided by an embodiment of the present application;
FIG. 1E is a fifth schematic illustration of a game tutorial provided by an embodiment of the present application;
FIG. 1F is a sixth schematic illustration of a game tutorial provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of an interactive system of a virtual scene according to an embodiment of the present application;
fig. 3A is a schematic diagram of a first architecture of a terminal device according to an embodiment of the present application;
fig. 3B is a schematic diagram of a second architecture of a terminal device according to an embodiment of the present application;
FIG. 4 is a first flow chart of an interaction method of virtual scenes according to an embodiment of the present application;
FIG. 5 is a second flow chart of an interaction method of a virtual scene according to an embodiment of the present application;
FIG. 6 is a third flow chart of an interaction method of a virtual scene according to an embodiment of the present application;
FIG. 7A is a schematic diagram of a first human-computer interaction interface provided by an embodiment of the present application;
FIG. 7B is a schematic diagram of presenting candidate accounts in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7C is a schematic diagram of presenting a synchronization entry in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7D is a schematic diagram of presenting a synchronization request in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7E is a schematic diagram of presenting synchronization entries corresponding to different media types in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7F is a schematic diagram of presenting interaction information in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7G is a schematic diagram of presenting an interaction entry in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7H is a schematic diagram of presenting a connection request in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7I is a schematic diagram of presenting interaction entries corresponding to different media types in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 7J is a schematic diagram of presenting the interactive connection state, the interaction information, the end-interaction entry, and the end-synchronization entry in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 8A is a schematic diagram of a second human-computer interaction interface provided by an embodiment of the present application;
FIG. 8B is a schematic diagram of presenting a synchronization request in the second human-computer interaction interface according to an embodiment of the present application;
FIG. 8C is a schematic diagram of presenting consultation information in the second human-computer interaction interface provided by an embodiment of the present application;
FIG. 9A is a schematic diagram of a game virtual scene presented in the first human-computer interaction interface according to an embodiment of the present application;
FIG. 9B is a schematic diagram of inviting a companion to play based on the first human-computer interaction interface provided by an embodiment of the present application;
FIG. 9C is a schematic diagram of initiating communication based on the first human-computer interaction interface provided by an embodiment of the application;
FIG. 9D is a schematic diagram of confirming/rejecting a mic-link invitation based on the second human-computer interaction interface according to an embodiment of the present application;
FIG. 9E is a schematic diagram of maintaining a mic-link based on the first human-computer interaction interface provided by an embodiment of the present application;
FIG. 10A is a schematic diagram of inviting a companion to play based on the second human-computer interaction interface provided by an embodiment of the present application;
FIG. 10B is a schematic diagram of initiating communication based on the second human-computer interaction interface provided by an embodiment of the present application;
FIG. 10C is a schematic diagram of confirming/rejecting a mic-link invitation based on the first human-computer interaction interface according to an embodiment of the present application;
FIG. 10D is a schematic diagram of maintaining a mic-link based on the second human-computer interaction interface provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of screen sharing and a voice call provided by an embodiment of the present application;
FIG. 12 is a fourth flowchart of an interaction method for a virtual scene according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of the present application.
In the following description, reference is made to "some embodiments", which describes a subset of all possible embodiments; "some embodiments" may denote the same subset or different subsets of all possible embodiments, and these subsets can be combined with one another where no conflict arises.
In the following description, the terms "first", "second", "third", and the like are merely used to distinguish similar objects and do not imply a particular ordering of the objects; it should be understood that, where permitted, objects so labeled may be interchanged in a particular order or sequence, so that the embodiments of the application described herein can be practiced in orders other than those illustrated or described herein. In the following description, the term "plurality" means at least two.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Before the embodiments of the present application are described in further detail, the terms involved in the embodiments are explained; the following interpretations apply to these terms as used herein.
1) Virtual scene: a scene, different from the real world, that is output by an electronic device. Visual perception of a virtual scene can be formed with the naked eye or with the assistance of a device, for example through a two-dimensional image output by a display screen, or through a three-dimensional image output by three-dimensional display technologies such as stereoscopic projection, virtual reality, and augmented reality; in addition, various perceptions simulating the real world, such as auditory, tactile, olfactory, and motion perceptions, can be formed through various possible hardware. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene; the embodiments of the present application do not limit the dimensionality of the virtual scene.
2) In response to: used to indicate the condition or state on which a performed operation depends. When the condition or state is satisfied, the one or more operations performed may be executed in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which multiple operations are executed.
3) Virtual object: the images of various people and things in a virtual scene that can be interacted with, or movable objects in the virtual scene. A movable object may be a virtual character, a virtual animal, an anime character, or the like, such as a character, an animal, a plant, an oil drum, a wall, or a stone displayed in the virtual scene. A virtual object may be a virtual avatar representing a user in the virtual scene. A virtual scene may include multiple virtual objects, each having its own shape and volume and occupying part of the space in the virtual scene. In the embodiments of the present application, a control operation for the virtual scene refers to a control operation on a virtual object in the virtual scene; for example, in the virtual scene of a tile or card game, it may be an operation indicating which tile or card to play.
4) Cloud Technology: a hosting technology that unifies a series of resources such as hardware, software, and networks in a wide area network or a local area network to realize the computation, storage, processing, and sharing of data. The embodiments of the present application can be implemented in combination with cloud technology; for example, the electronic devices involved may be cloud devices.
5) Intelligent Transportation System (ITS): a comprehensive transportation system that effectively and comprehensively applies advanced science and technology (information technology, computer technology, data communication technology, sensor technology, electronic control technology, automatic control theory, operations research, artificial intelligence, and the like) to transportation, service control, and vehicle manufacturing, and strengthens the connections among vehicles, roads, and users, thereby ensuring safety, improving efficiency, improving the environment, and saving energy. The embodiments of the present application can be applied in an intelligent transportation system; for example, the interaction method for a virtual scene can be implemented in a vehicle-mounted terminal, where the virtual scene is obtained by modeling real road conditions, so as to improve the driving experience and skill of the user of the vehicle-mounted terminal.
For virtual scenes, in the solutions provided by the related art, the user usually controls the virtual scene according to his or her own experience and skill, and before the user actually takes control, a tutorial is usually played to explain how to perform control operations. Taking the virtual scene of a mahjong game as an example, FIGS. 1A to 1F show schematic diagrams of a game tutorial: FIG. 1A shows the tile faces of the mahjong tiles, FIG. 1B shows the tile combinations of the mahjong tiles, and FIGS. 1A and 1B are used to teach tile recognition; FIG. 1C is used to teach how to play a tile; FIG. 1D shows the "chow" (chi) operation in mahjong, FIG. 1E shows the "pong" (peng) operation in mahjong, and FIG. 1F shows the "kong" (gang) operation in mahjong. However, such a tutorial can only let the player (user) know how the game is operated; it cannot make the player understand the real play of mahjong or its playing logic (when to play which tiles). As a result, after learning the tutorial, the player still lacks the skill to compete with other players, and the operations performed in actual play may be ineffective or even counterproductive. Therefore, the solutions provided by the related art cannot effectively raise the user's level of operation; at the same time, the computing resources the electronic device consumes in responding to control operations are wasted, that is, the actual utilization of the computing resources consumed by the electronic device is low.
The embodiments of the present application provide an interaction method and apparatus for a virtual scene, an electronic device, a computer-readable storage medium, and a computer program product, which can improve the control effect on the virtual scene and improve the actual utilization rate of the computing resources consumed by the electronic device. An exemplary application of the electronic device provided by the embodiments of the present application is described below; the electronic device may be implemented as various types of terminal devices.
Referring to fig. 2, fig. 2 is a schematic diagram of the architecture of an interaction system 100 of a virtual scene according to an embodiment of the present application, where a server 200, a terminal device 400, and a terminal device 500 may be connected through a network 300, and the network 300 may be a wide area network, a local area network, or a combination of the two.
In some embodiments, taking the electronic device as a terminal device as an example, the interaction method for a virtual scene provided in the embodiments of the present application may be implemented by the terminal device. For example, the terminal device 400 presents a virtual scene in a first human-computer interaction interface, where the virtual scene is logged into based on a first account; in response to a synchronization trigger operation for the virtual scene, the virtual scene is synchronized to a second human-computer interaction interface of the terminal device 500 corresponding to a second account. The terminal device 500 presents, in the second human-computer interaction interface corresponding to the second account, the virtual scene synchronized from the first human-computer interaction interface; receives interaction information for the virtual scene through the second human-computer interaction interface; and sends the interaction information to the first account, where the interaction information is used to guide the control operations of the first account in the virtual scene. The terminal device 400 presents the received interaction information in the first human-computer interaction interface. In this way, the user of the terminal device 400 can perform control operations on the virtual scene according to the interaction information, and the control effect on the virtual scene can be improved.
In some embodiments, the interaction method for a virtual scene provided by the embodiments of the present application may be implemented cooperatively by the terminal devices and the server. Here, communication between the terminal device 400 and the terminal device 500 may be implemented by means of the server 200: for example, the terminal device 400 may synchronize the virtual scene to the terminal device 500 through the server 200; for another example, the terminal device 500 may send the interaction information to the server 200, so that the server 200 forwards it to the terminal device 400.
In some embodiments, various results involved in the interaction process of the virtual scene (such as the virtual scene itself and the interaction information) can be stored in a blockchain; because the blockchain is tamper-resistant, the accuracy of the data in the blockchain can be ensured. The electronic device may send a query request to the blockchain to query the data stored therein.
In some embodiments, the terminal device (such as the terminal device 400 or the terminal device 500 shown in fig. 2) or the server (such as the server 200 shown in fig. 2) may implement the interaction method for a virtual scene provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or software module in an operating system; a native application (APP), i.e., a program that needs to be installed in the operating system to run, such as a game application; an applet, i.e., a program that can run simply by being downloaded into a browser environment; or an applet that can be embedded in any APP. In general, the computer program may be any form of application, module, or plug-in. The game application may be any one of a chess-and-card game, a multiplayer online battle arena (Multiplayer Online Battle Arena, MOBA) game, a first-person shooter (First-Person Shooting, FPS) game, a third-person shooter (Third-Person Shooting, TPS) game, or a multiplayer survival game, without limitation.
In some embodiments, the server (such as the server 200 shown in fig. 2) may be an independent physical server, a server cluster or distributed system composed of a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), and big-data and artificial-intelligence platforms. The terminal device (such as the terminal device 400 or the terminal device 500 shown in fig. 2) may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a vehicle-mounted terminal, or a smart television. The terminal device and the server may be connected directly or indirectly through wired or wireless communication, which is not limited in the embodiments of the present application.
The following description takes the electronic device provided by the embodiments of the present application being a terminal device as an example; it can be understood that, when the electronic device is a server, the presentation-related parts of the structure shown in fig. 3A (such as the user interface, the presentation module, and the input processing module) may be omitted. Referring to fig. 3A, fig. 3A is a schematic structural diagram of a terminal device 400 according to an embodiment of the present application. The terminal device 400 shown in fig. 3A includes: at least one processor 410, a memory 450, at least one network interface 420, and a user interface 430. The various components in the terminal device 400 are coupled together by a bus system 440. It can be understood that the bus system 440 is used to enable connected communication between these components. In addition to the data bus, the bus system 440 includes a power bus, a control bus, and a status signal bus; but for clarity of illustration, the various buses are labeled as the bus system 440 in fig. 3A.
The processor 410 may be an integrated circuit chip with signal processing capability, such as a general-purpose processor, a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 430 includes one or more output devices 431, including one or more speakers and/or one or more visual displays, that enable presentation of the media content. The user interface 430 also includes one or more input devices 432, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
Memory 450 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 450 optionally includes one or more storage devices physically remote from processor 410.
Memory 450 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a random access Memory (RAM, random Access Memory). The memory 450 described in embodiments of the present application is intended to comprise any suitable type of memory.
In some embodiments, memory 450 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 451, including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, and a driver layer, used to implement various basic services and handle hardware-based tasks;
a network communication module 452 for accessing other electronic devices via one or more (wired or wireless) network interfaces 420, the exemplary network interface 420 comprising: bluetooth, wireless compatibility authentication (WiFi), and universal serial bus (USB, universal Serial Bus), etc.;
a presentation module 453 for enabling presentation of information (e.g., a user interface for operating peripheral devices and displaying content and information) via one or more output devices 431 (e.g., a display screen, speakers, etc.) associated with the user interface 430;
an input processing module 454 for detecting one or more user inputs or interactions from one of the one or more input devices 432 and translating the detected inputs or interactions.
In some embodiments, the interaction device of the virtual scene provided by the embodiments of the present application may be implemented in software, and fig. 3A shows the interaction device 455 of the virtual scene stored in the memory 450, which may be software in the form of a program and a plug-in, and includes the following software modules: scene presentation module 4551, scene synchronization module 4552, and information presentation module 4553, which are logical, and thus may be arbitrarily combined or further split depending on the functions implemented. The functions of the respective modules will be described hereinafter.
Referring to fig. 3B, fig. 3B is a schematic structural diagram of a terminal device 500 according to an embodiment of the present application, where the terminal device 500 shown in fig. 3B includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in the terminal device 500 are coupled together by a bus system 540. It is appreciated that the bus system 540 is used to enable connected communications between these components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to the data bus. But for clarity of illustration, the various buses are labeled as bus system 540 in fig. 3B.
The processor 510 may be an integrated circuit chip with signal processing capability, such as a general-purpose processor, a DSP, another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 530 includes one or more output devices 531 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 530 also includes one or more input devices 532, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 550 may optionally include one or more storage devices physically located remote from processor 510.
Memory 550 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The non-volatile memory may be ROM and the volatile memory may be RAM. The memory 550 described in embodiments of the present application is intended to comprise any suitable type of memory.
In some embodiments, memory 550 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
a network communication module 552 for accessing other electronic devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: Bluetooth, wireless compatibility authentication (WiFi), USB, etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating a peripheral device and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
the input processing module 554 is configured to detect one or more user inputs or interactions from one of the one or more input devices 532 and translate the detected inputs or interactions.
In some embodiments, the interaction device of the virtual scene provided by the embodiments of the present application may be implemented in a software manner, and fig. 3B shows the interaction device 555 of the virtual scene stored in the memory 550, which may be software in the form of a program, a plug-in, or the like, including the following software modules: the synchronous rendering module 5551, the receiving module 5552 and the transmitting module 5553 are logical, and thus may be arbitrarily combined or further split according to the implemented functions. The functions of the respective modules will be described hereinafter.
The interaction method of the virtual scene provided by the embodiment of the application will be described in combination with the exemplary application and implementation of the electronic device provided by the embodiment of the application. As an example, a flowchart of an interaction method of the virtual scene shown in fig. 4, fig. 5 and fig. 6 is provided, where fig. 4 corresponds to the terminal device 400 shown in fig. 2; fig. 5 corresponds to the terminal device 500 shown in fig. 2; fig. 6 corresponds to the interaction procedure between the terminal device 400 and the terminal device 500 shown in fig. 2.
First, referring to fig. 4, the steps shown in fig. 4 will be described.
In step 101, presenting a virtual scene in a first human-computer interaction interface; the virtual scene is logged in based on the first account number.
Here, a virtual scene is presented in the first human-computer interaction interface corresponding to the first account, where the virtual scene is logged into based on the first account and is, for example, observed from the perspective of the first account.
It should be noted that the virtual scene may occupy the entire area of the first human-computer interaction interface, or only part of it, with the remaining area serving as an interaction area for presenting information related to the interaction process of the virtual scene. As an example, referring to the first human-computer interaction interface shown in fig. 7A, the virtual scene 71 occupies only a partial area, and the remaining area is the interaction area.
In step 102, in response to the synchronization triggering operation for the virtual scene, the virtual scene is synchronized to a second human-computer interaction interface corresponding to the second account.
Here, when a synchronization trigger operation for the virtual scene is received, the virtual scene is synchronized to the second human-computer interaction interface corresponding to the second account. It should be noted that, in the embodiment of the present application, the triggering manner of the related triggering operation (e.g., synchronous triggering operation) is not limited, and may be, for example, touch triggering, such as clicking, long pressing, etc., or non-touch triggering, such as voice input, gesture input, etc.
It should be noted that the second account refers to a type of account; that is, there may be one or more second accounts.
In some embodiments, before synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, the method further includes: presenting a plurality of candidate accounts different from the first account in a first human-computer interaction interface; and responding to the selection operation of the plurality of candidate accounts, and taking the selected candidate account as a second account.
For example, when a synchronous triggering operation for a virtual scene is received, a plurality of candidate accounts different from the first account may be presented in the first man-machine interaction interface, where the candidate accounts may be presented in the virtual scene or in the interaction area.
As an example, referring to fig. 7B, when a synchronization trigger operation is received in the first human-computer interaction interface illustrated in fig. 7A, a candidate account may be presented in the virtual scene, wherein, when the candidate account is presented, at least one of an account identification of the candidate account and an account attribute parameter may be presented, the account identification including at least one of a name and an avatar, and the account attribute parameter includes at least one of a rank, a number of virtual resources, and a number of rounds.
When a selection operation for a plurality of candidate accounts is received, the candidate account selected by the selection operation is taken as a second account, wherein the selection operation is such as a triggering operation for an invitation control shown in fig. 7B.
In some embodiments, before presenting the plurality of candidate accounts distinct from the first account in the first human-computer interaction interface, at least one of the following is also performed: taking an account that has a friend relation with the first account as a candidate account; determining a synchronization geographic range centered on the geographic position of the first account, and taking accounts within the range as candidate accounts; taking a historical synchronization account of the first account as a candidate account, where a historical synchronization account represents an account to which the first account has historically synchronized the virtual scene; determining a historical operation record of the first account in the virtual scene, and taking an account that has a target relation with the first account in that record as a candidate account, where the target relation includes at least one of a cooperative relation, an adversarial relation, and a neutral relation; and determining a parameter condition according to the account attribute parameters of the first account in the virtual scene, and taking accounts whose attribute parameters satisfy the condition as candidate accounts, where the account attribute parameters include at least one of a level, a number of virtual resources, and a number of games.
In the embodiment of the application, all accounts different from the first account can be used as candidate accounts, and a plurality of accounts different from the first account can be screened to obtain candidate accounts. Wherein the screening treatment mode comprises at least one of the following steps:
1) And taking the account with the friend relation with the first account as a candidate account. In some embodiments, an account with a friend affinity greater than an affinity threshold with the first account may also be used as a candidate account.
2) And determining a synchronous geographic range centering on the geographic position of the first account, and taking the account in the synchronous geographic range as a candidate account. The shape and area of the synchronous geographic range are not limited, and can be set according to actual application scenes.
3) Taking a historical synchronization account of the first account as a candidate account, where a historical synchronization account represents an account to which the first account has historically synchronized the virtual scene, i.e., an account that has historically interacted with the first account.
4) And determining a historical operation record of the first account in the virtual scene, and taking the account with a target relation with the first account in the historical operation record as a candidate account, wherein the target relation comprises at least one of a cooperative relation, a countermeasure relation and a neutral relation. That is, the candidate account number may include at least one of a teammate, an opponent, and a neutral account number of the first account number in history. The historical operation record in the historical time period can be acquired, and the historical time period can be set according to an actual application scene.
5) And determining a parameter condition according to the account attribute parameters of the first account in the virtual scene, and taking the account with the account attribute parameters meeting the parameter condition as a candidate account, wherein the account attribute parameters comprise at least one of the grade, the virtual resource quantity and the game quantity. For example, the account attribute parameters of the first account in the virtual scene may be subjected to parameter amplification (e.g. multiplied by 2) to obtain a first threshold, and an account with the account attribute parameter greater than the first threshold is used as a candidate account, so as to ensure the advantage of the candidate account on the account attribute parameters (compared with the first account).
Thus, the flexibility of screening candidate accounts can be improved.
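The five screening rules above can be sketched as a simple filter. This is an illustrative sketch only: the `Account` fields, the rule set, and the doubling of the first account's level are hypothetical stand-ins for whatever account data an implementation actually stores.

```python
from dataclasses import dataclass

@dataclass
class Account:
    account_id: str
    is_friend: bool          # has a friend relation with the first account
    distance_km: float       # geographic distance from the first account
    was_synced_before: bool  # historically received this virtual scene
    level: int               # account attribute parameter in the virtual scene

def screen_candidates(accounts, first_account_level, sync_radius_km=50.0):
    """Keep an account as a candidate if it satisfies at least one rule."""
    # Parameter amplification (e.g. multiplied by 2) of the first account's
    # level, so candidates have an advantage in account attribute parameters.
    level_floor = first_account_level * 2
    candidates = []
    for a in accounts:
        if (a.is_friend
                or a.distance_km <= sync_radius_km   # within the sync range
                or a.was_synced_before               # historical sync account
                or a.level > level_floor):           # attribute condition
            candidates.append(a)
    return candidates
```

Any single satisfied rule is enough here; an implementation could equally require a conjunction of rules, depending on the actual application scene.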
In some embodiments, before synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, the method further includes: screening the multiple candidate accounts according to screening parameters respectively corresponding to the multiple candidate accounts different from the first account to obtain a second account; wherein the screening parameters include at least one of: friend affinity with the first account; a geographic location distance from the first account; historical synchronization times with the first account; the number of occurrences in the historical operating record of the first account; account attribute parameters in the virtual scene; the account attribute parameters include at least one of a rank, a number of virtual resources, and a number of games.
Besides manually selecting the second account, the embodiment of the application can also realize automatic selection of the second account. For example, according to screening parameters respectively corresponding to a plurality of candidate accounts different from the first account, screening the plurality of candidate accounts to obtain a second account, wherein the screening parameters include at least one of the following: friend affinity with the first account; a geographic location distance from the first account; historical synchronization times with the first account; the number of occurrences in the historical operating record of the first account; account attribute parameters in the virtual scene; the account attribute parameters include at least one of a rank, a number of virtual resources, and a number of games.
The screening target for screening the second account may include at least one of the following: the friend affinity between the second account and the first account is the largest; the geographic position distance between the second account and the first account is the smallest; the history synchronization times between the second account and the first account are the most; the second account number has the largest occurrence number in the historical operation record of the first account number; the account attribute parameter of the second account in the virtual scene is the largest; the account attribute parameters include at least one of a rank, a number of virtual resources, and a number of games.
In addition, the screening parameters corresponding to the candidate accounts may be processed by an artificial intelligence model to obtain the screened second account.
In this way, user operations can be reduced, and the most suitable second account can be screened out automatically.
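A minimal sketch of such automatic selection, assuming each candidate is represented as a plain dict of screening parameters; the weight values are hypothetical, and a deployed system might instead use a trained model as noted above:

```python
def pick_second_account(candidates, weights=None):
    """Rank candidates by a weighted score and return the best one."""
    # Larger affinity / history-sync count / level is better; distance is
    # given a negative weight so that a nearer account scores higher.
    weights = weights or {"affinity": 1.0, "distance_km": -0.1,
                          "history_syncs": 0.5, "level": 0.2}

    def score(c):
        return sum(w * c.get(k, 0) for k, w in weights.items())

    return max(candidates, key=score)
```

Each screening target listed above (maximum affinity, minimum distance, and so on) corresponds to emphasizing one weight; setting the others to zero reduces the score to a single-criterion selection.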
In some embodiments, before synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, the method further includes: presenting a synchronization entry for the virtual scene in the first human-computer interaction interface; and taking the triggering operation aiming at the synchronous entrance as the synchronous triggering operation aiming at the virtual scene.
For example, the synchronization portal for the virtual scene may be presented anywhere in the first human-computer interaction interface, such as in the virtual scene (synchronization portal 1 shown in fig. 7C), and such as in the interaction region (synchronization portal 2 shown in fig. 7C). Then, the trigger operation for the synchronization portal is taken as the synchronous trigger operation for the virtual scene.
It should be noted that, the portal (e.g., synchronization portal) according to the embodiments of the present application may be implemented in the form of a control, which is not limited to this.
In some embodiments, the above-mentioned synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account number in response to the synchronization triggering operation for the virtual scene may be implemented in such a manner that: responding to a synchronous triggering operation aiming at the virtual scene, and sending a synchronous request to a second account; the synchronization request comprises at least one of account identification of the first account and scene identification of the virtual scene; and synchronizing the virtual scene to a second man-machine interaction interface corresponding to the second account when receiving the confirmation information of the second account for the synchronization request.
When the synchronization trigger operation for the virtual scene is received, synchronization can be performed directly, or it can be performed after confirmation by the second account, which improves the effectiveness of the synchronization. For example, the first account sends a synchronization request to the second account, where the synchronization request includes at least one of an account identification of the first account and a scene identification of the virtual scene.
And synchronizing the virtual scene to a second man-machine interaction interface corresponding to the second account when receiving the confirmation information of the second account for the synchronization request.
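The request-then-confirm flow can be sketched as follows. The class and method names are hypothetical, and the second user's confirmation in the second human-computer interaction interface is simulated by a flag:

```python
from dataclasses import dataclass

@dataclass
class SyncRequest:
    first_account_id: str  # account identification of the first account
    scene_id: str          # scene identification of the virtual scene

class SecondTerminal:
    def __init__(self, auto_accept=True):
        self.auto_accept = auto_accept
        self.synced_scene = None

    def on_sync_request(self, request):
        # Present the request in the second interface; the user's
        # confirmation is simulated here by the auto_accept flag.
        return self.auto_accept

    def receive_scene(self, scene_id):
        self.synced_scene = scene_id

def synchronize(first_account_id, scene_id, second_terminal):
    """Send a sync request; synchronize the scene only on confirmation."""
    request = SyncRequest(first_account_id, scene_id)
    if second_terminal.on_sync_request(request):
        second_terminal.receive_scene(request.scene_id)
        return True
    return False
```

In the cooperative deployment described earlier, the request and the confirmation would travel through the server 200 rather than as direct method calls.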
In some embodiments, before synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, the method further includes: presenting a synchronous request sent by a second account in a first man-machine interaction interface; the synchronization request comprises an account identifier of the second account; and taking the confirmation operation for the synchronous request as a synchronous triggering operation for the virtual scene.
Here, the second account may also actively request synchronization. For example, a synchronization request sent by the second account may be presented in the first human-computer interaction interface, where the synchronization request includes an account identification of the second account. As an example, referring to fig. 7D, the synchronization request sent by the second account is shown in the virtual scene of the first man-machine interaction interface, and of course, the synchronization request may also be presented in the interaction area.
When a confirmation operation for the synchronization request is received in the first man-machine interaction interface, the confirmation operation is used as a synchronization triggering operation for the virtual scene. In fig. 7D, a confirmation operation for the synchronization request is as a trigger operation for the "accept" control.
In some embodiments, when synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, further comprising: establishing interactive connection between a first account and a second account; the interactive connection is used for receiving the interactive information sent by the second account based on the second man-machine interaction interface.
Here, in response to the synchronization trigger operation for the virtual scene, the interactive connection between the first account and the second account may be established directly. The established interactive connection may be a one-way interactive connection, i.e., used only to receive the interaction information sent by the second account based on the second human-computer interaction interface; or a two-way interactive connection, used both to receive the interaction information sent by the second account based on the second human-computer interaction interface and to send consultation information of the first account.
It should be noted that, the communication connection used for the interactive connection and the synchronous virtual scene may be the same communication connection or different communication connections.
In some embodiments, the above-mentioned establishment of the interactive connection between the first account and the second account may be achieved by: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the synchronization triggering operation; the interactive connection is used for transmitting interactive information conforming to the target media type.
Here, the synchronization trigger operation may also specify the media type used to establish the interactive connection; for ease of distinction, the media type specified by the synchronization trigger operation is referred to as the target media type. As shown in fig. 7E, synchronization entries corresponding to multiple media types (such as text, voice, and video) may be presented in the human-computer interaction interface, and the media type corresponding to the triggered synchronization entry is taken as the target media type.
In this way, in response to the synchronous triggering operation for the virtual scene, an interactive connection between the first account and the second account is established according to the target media type, and the interactive connection is used for transmitting interactive information (or consultation information) conforming to the target media type.
In some embodiments, the above-mentioned synchronization of the virtual scene to the second human-computer interaction interface corresponding to the second account may be implemented in the following manner: when a synchronization condition is met, synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account; where the synchronization condition includes at least one of: the duration for which the first account has performed no operation in the virtual scene reaches a duration threshold; an account attribute parameter of the first account in the virtual scene is smaller than a parameter threshold, where the account attribute parameters include at least one of a level, a number of virtual resources, and a number of games; and the win rate of the first account, predicted from image data of the virtual scene, is less than a win-rate threshold.
Here, a synchronization condition may be set, and when the synchronization condition is satisfied, the virtual scene is synchronized to the second man-machine interaction interface corresponding to the second account. The synchronization condition includes at least one of the following: the duration for which the first account performs no operation in the virtual scene reaches a duration threshold (indicating that the user using the first account does not know how to operate); an account attribute parameter of the first account in the virtual scene is smaller than a parameter threshold, wherein the account attribute parameters include at least one of a level, a number of virtual resources, and a number of games, and the parameter threshold may be set in advance; the win rate of the first account predicted from the image data of the virtual scene is less than a win rate threshold. The win rate prediction may rely on a trained win rate prediction model: in the training stage, each training sample is a pair of (image data of the virtual scene in which a sample account is logged in, the operation result of the sample account), where the operation result is either a win or a loss.
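The synchronization-condition check described above can be sketched as follows. This is an illustrative outline only: `AccountState`, `should_synchronize`, and all threshold defaults are hypothetical names and values, not part of the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class AccountState:
    idle_seconds: float        # time since the first account's last operation
    level: int                 # representative account attribute parameter
    predicted_win_rate: float  # output of the trained win-rate prediction model

def should_synchronize(state: AccountState,
                       idle_threshold: float = 60.0,
                       level_threshold: int = 5,
                       win_rate_threshold: float = 0.3) -> bool:
    """Return True when at least one synchronization condition is met."""
    if state.idle_seconds >= idle_threshold:           # user seems stuck
        return True
    if state.level < level_threshold:                  # low attribute parameter
        return True
    if state.predicted_win_rate < win_rate_threshold:  # model predicts a loss
        return True
    return False
```

Because the conditions are combined with "at least one of", any single condition being met triggers synchronization to the second man-machine interaction interface.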
In some embodiments, the above-mentioned synchronization of the virtual scene to the second human-computer interaction interface corresponding to the second account may be implemented in such a manner: any one of the following processes is performed: responding to a synchronous region selection operation aiming at the virtual scene, and synchronizing the selected synchronous region to a second human-computer interaction interface; sensitive information shielding processing is carried out on the virtual scene, and the shielded virtual scene is synchronized to a second human-computer interaction interface; and receiving the synchronous region parameters sent by the second account, and synchronizing the synchronous region corresponding to the synchronous region parameters in the virtual scene to the second man-machine interaction interface.
In the synchronization process, either the full virtual scene or only part of it may be synchronized. For example, the region to be synchronized may be determined in the following three ways:
1) And responding to the synchronous region selection operation aiming at the virtual scene, and synchronizing the selected synchronous region to the second human-computer interaction interface. The first account can select the synchronous area, wherein the first account can select the synchronous area through intercepting operation, and can also select the synchronous area through inputting synchronous area parameters.
2) And carrying out sensitive information shielding processing on the virtual scene, and synchronizing the shielded virtual scene to a second man-machine interaction interface. The sensitive information can be set according to the actual application scene, such as account identification appearing in the virtual scene. Thus, the safety of synchronization can be improved.
3) And receiving the synchronous region parameters sent by the second account, and synchronizing the synchronous region corresponding to the synchronous region parameters in the virtual scene to the second man-machine interaction interface. The second account can select the synchronization area, so that the effectiveness of synchronization can be improved.
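The three region-determination ways above reduce to two primitives: cropping a selected region (ways 1 and 3) and masking sensitive regions (way 2). A minimal sketch, modelling frames as 2-D lists of pixels; `crop_region` and `mask_regions` are hypothetical helper names, not from the embodiment:

```python
def crop_region(frame, x, y, width, height):
    """Keep only the selected synchronization region (ways 1 and 3)."""
    return [row[x:x + width] for row in frame[y:y + height]]

def mask_regions(frame, sensitive_boxes, fill=0):
    """Way 2: blank out boxes covering sensitive info (e.g. account IDs).

    Returns a new frame; the original is left untouched.
    """
    masked = [list(row) for row in frame]
    for (x, y, w, h) in sensitive_boxes:
        for row in masked[y:y + h]:
            row[x:x + w] = [fill] * w
    return masked
```

In way 3, the `(x, y, width, height)` tuple would come from the synchronization region parameters sent by the second account.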
In some embodiments, between any of the steps, further comprising: carrying out semantic extraction processing on the image data of the virtual scene to obtain consultation information; and sending the consultation information to the second account.
Here, semantic extraction processing may be performed on the image data of the virtual scene to obtain semantic information in the virtual scene. For example, for a mahjong game, semantic extraction processing may be performed on the image data of the virtual scene in a game round to obtain semantic information including the tiles already discarded in the round and the tiles held by the first account.
After the semantic information is obtained, it may be used directly as the consultation information, or a structural sentence (such as "how should the next tile be played") may be appended to the semantic information to obtain the consultation information. The consultation information is then sent to the second account, so that the second account can quickly understand the question of the first account and feed back the interaction information in time.
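Assembling the consultation information from extracted semantics might look like the following sketch for the mahjong example. `build_consultation_info` and the default question are illustrative assumptions; the semantic extraction itself (recognizing tiles from image data) is out of scope here.

```python
def build_consultation_info(discarded_tiles, hand_tiles,
                            question="how should the next tile be played?"):
    """Append a structural sentence to the extracted semantic information."""
    semantics = (f"discarded: {', '.join(discarded_tiles)}; "
                 f"hand: {', '.join(hand_tiles)}")
    return f"{semantics} -- {question}"
```

The resulting string is what would be sent to the second account alongside the synchronized scene.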
It should be noted that, the consultation information may also be received through the first man-machine interaction interface, that is, input by the user using the first account.
In some embodiments, the above-mentioned synchronization of the virtual scene to the second human-computer interaction interface corresponding to the second account may be implemented in such a way that: coding the image data of the virtual scene to obtain a coded data packet; and sending the encoded data packet to a second account, so that the second account decodes the encoded data packet, and presents a virtual scene in a second man-machine interaction interface according to the decoded image data.
Here, synchronization of the virtual scene can be achieved via encoding and decoding (codec). For example, the image data of the virtual scene may be encoded to obtain an encoded data packet. The encoded data packet is then sent to the second account, so that the second account decodes the encoded data packet and presents the virtual scene in the second man-machine interaction interface according to the decoded image data. In this way, quick and accurate synchronization can be achieved based on image data.
In some embodiments, the above-described encoding of image data of a virtual scene may be implemented in such a way that: any one of the following processes is performed: periodically encoding image data of the virtual scene; and carrying out real-time detection processing on the image data of the virtual scene, and carrying out coding processing on the updated image data when the image data is detected to be updated.
Here, the image data of the virtual scene may be encoded periodically (e.g., once a second), or the image data of the virtual scene may be detected in real time, and when the update of the image data is detected, the updated image data may be encoded. The former can ensure timeliness and accuracy of synchronization, and the latter can reduce consumption of computing resources, namely, synchronization is performed when images change.
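The change-triggered strategy can be sketched as follows, using a content hash to detect frame updates and zlib compression as a stand-in for a real video codec; `ChangeTriggeredEncoder` is a hypothetical name. The periodic strategy would instead invoke the encoder on a fixed timer (e.g. once a second) without the hash comparison.

```python
import hashlib
import zlib

class ChangeTriggeredEncoder:
    """Encode a frame only when its content differs from the previous one."""

    def __init__(self):
        self._last_digest = None

    def encode_if_changed(self, frame: bytes):
        """Return an encoded packet when the frame changed, else None."""
        digest = hashlib.sha256(frame).digest()
        if digest == self._last_digest:
            return None                      # no update: save compute
        self._last_digest = digest
        return zlib.compress(frame)          # stand-in for a video codec
```

A `None` result means nothing is sent, which is how the change-triggered variant reduces consumption of computing resources.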
In step 103, receiving interaction information sent by the second account based on the second man-machine interaction interface, and presenting the interaction information in the first man-machine interaction interface; the interaction information is used for guiding the control operation of the first account in the virtual scene.
The interaction information sent by the second account based on the second man-machine interaction interface is presented in the first man-machine interaction interface, so that a user holding the first account can better control the virtual scene according to the interaction information.
It should be noted that the interactive information may be presented in the virtual scene or in the interactive area, and as an example, the synchronous area in the first man-machine interactive interface in fig. 7F shows the interactive information "XX card should be played next" with text as the media type. The output mode of the interactive information is determined according to the media type of the interactive information, and is not limited to presentation, for example, when the interactive information is voice information, the voice information can be played.
In some embodiments, prior to step 103, further comprising: responding to an interaction triggering operation aiming at a virtual scene, and establishing interaction connection between a first account and a second account; the interactive connection is used for receiving the interactive information sent by the second account based on the second man-machine interaction interface.
Here, the interactive connection may also be established based on an interactive trigger operation, which is different from the synchronous trigger operation, and the established interactive connection is also different from the communication connection used for synchronizing the virtual scene.
In some embodiments, the above-mentioned interactive triggering operation in response to the virtual scene may be implemented in such a manner that an interactive connection between the first account and the second account is established: presenting an interaction inlet for the virtual scene in a first human-computer interaction interface; and taking the triggering operation aiming at the interaction entrance as the interaction triggering operation aiming at the virtual scene.
Here, when the interaction triggering operation for the virtual scene is received, an interaction entry for the virtual scene may be presented in the first man-machine interaction interface, and the interaction entry may be presented in the virtual scene (e.g. the interaction entry 1 shown in fig. 7G) or may be presented in the interaction region (e.g. the interaction entry 2 shown in fig. 7G).
On the basis of the presented interactive portal, the triggering operation for the interactive portal can be used as the interactive triggering operation for the virtual scene.
In some embodiments, the above-mentioned interactive triggering operation in response to the virtual scene may be implemented in such a manner that an interactive connection between the first account and the second account is established: responding to the interaction triggering operation aiming at the virtual scene, and sending a connection request to a second account; the connection request comprises at least one of account identification of the first account and scene identification of the virtual scene; and when receiving the confirmation information of the second account for the connection request, establishing interactive connection between the first account and the second account.
Here, when the interaction triggering operation for the virtual scene is received, the interaction connection can be directly established, or the interaction connection can be established after confirmation of the second account number.
For example, a connection request may be sent to the second account, where the connection request includes at least one of an account identification of the first account and a scene identification of the virtual scene; and when receiving the confirmation information of the second account for the connection request, establishing interactive connection between the first account and the second account. Thus, the effectiveness of the established interactive connection can be improved.
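The confirm-before-connect flow can be sketched as a small broker that holds pending requests until the second account confirms; `ConnectionBroker` and its method names are illustrative, not part of the patent.

```python
class ConnectionBroker:
    """Hold connection requests until the receiving account confirms them."""

    def __init__(self):
        self.pending = {}       # request id -> (first account, scene id)
        self.connections = set()

    def request(self, request_id, first_account, scene_id):
        """First account sends a request carrying its account and scene ids."""
        self.pending[request_id] = (first_account, scene_id)

    def confirm(self, request_id, second_account):
        """Second account confirms; only now is the connection established."""
        first_account, _scene = self.pending.pop(request_id)
        pair = (first_account, second_account)
        self.connections.add(pair)
        return pair
```

No interactive connection exists between `request` and `confirm`, matching the requirement that confirmation of the second account precedes establishment.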
In some embodiments, before establishing the interactive connection between the first account and the second account, the method further includes: receiving a connection request sent by a second account based on a second man-machine interaction interface; the connection request comprises an account identifier of the second account; and taking the confirmation operation aiming at the connection request as the interaction triggering operation aiming at the virtual scene.
Here, the second account may actively initiate the connection request. For the first account, when a connection request sent by the second account based on the second man-machine interaction interface is received, the connection request can be presented at any position of the first man-machine interaction interface, and the confirmation operation for the connection request is used as the interaction triggering operation for the virtual scene. As an example, a schematic diagram as in fig. 7H is provided, and the confirmation operation for the connection request is the triggering operation for the "consent" control.
In some embodiments, the above-mentioned establishment of the interactive connection between the first account and the second account may be achieved by: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the interactive triggering operation; the interactive connection is used for transmitting interactive information conforming to the target media type.
Here, the interactive triggering operation may further designate a media type for establishing the interactive connection, and for convenience of distinction, the media type designated by the interactive triggering operation is designated as a target media type. As shown in fig. 7I, an interaction entry corresponding to each of a plurality of media types (such as text, voice, and video) may be presented in the human-computer interaction interface, and the media type corresponding to the triggered interaction entry is used as the target media type.
In this way, in response to the interaction triggering operation for the virtual scene, an interaction connection between the first account and the second account is established according to the target media type, and the interaction connection is used for transmitting interaction information (or consultation information) conforming to the target media type.
In some embodiments, between any of the steps, further comprising: presenting at least one of a state of the interactive connection, the interactive information, an end-interaction entry, and an end-synchronization entry in the first man-machine interaction interface; the interactive connection is used for receiving the interactive information sent by the second account; the end-interaction entry is used for disconnecting the interactive connection when triggered; and the end-synchronization entry is used for disconnecting the interactive connection when triggered and stopping synchronizing the virtual scene to the second man-machine interaction interface.
Here, at least one of the state of the interactive connection, the interactive information, the end-interaction entry, and the end-synchronization entry may be presented at any position in the first human-computer interaction interface; as an example, see fig. 7J, where the state of the interactive connection may be either unestablished or established (online). In this way, the user using the first account can conveniently keep track of the interaction process.
As shown in fig. 4, in the embodiment of the present application, the control effect of the first account on the virtual scene may be improved by synchronizing the virtual scene and transmitting the interaction information, and meanwhile, the actual utilization rate of the computing resources consumed by the electronic device (such as the terminal device where the first account is located) may also be improved.
Referring to fig. 5, fig. 5 is a flowchart of an interaction method of a virtual scene according to an embodiment of the present application, and will be described with reference to the steps shown in fig. 5 from the perspective of the terminal device 500 shown in fig. 2.
In step 201, presenting a virtual scene synchronized based on the first man-machine interaction interface in a second man-machine interaction interface corresponding to a second account; the virtual scene is logged in based on the first account number.
Here, when the second account receives the virtual scene synchronized by the first account based on the first man-machine interaction interface, the virtual scene may be presented in the second man-machine interaction interface, and as an example, referring to fig. 8A, the virtual scene 81 shown in the second man-machine interaction interface is the virtual scene synchronized by the first account. The first account and the second account may be accounts in the same application program.
In some embodiments, prior to step 201, further comprising: presenting a plurality of candidate accounts different from the second account in a second human-computer interaction interface; and responding to the selection operation of the plurality of candidate accounts, and taking the selected candidate account as a first account.
For the second account, multiple candidate accounts different from the second account may also be presented in the second human-computer interaction interface, similar to fig. 7B. And when the selection operation for the plurality of candidate accounts is received in the second man-machine interaction interface, the selected candidate account is used as the first account. At this time, a synchronization request may be sent to the first account to prompt the first account to synchronize the virtual scene.
In some embodiments, further comprising performing at least one of the following: taking an account having a friend relationship with the second account as a candidate account; determining a synchronization geographic range centered on the geographic position of the second account, and taking accounts within the synchronization geographic range as candidate accounts; taking a historical synchronization account of the second account as a candidate account, wherein a historical synchronization account is an account that has historically synchronized a virtual scene with the second account; determining a historical operation record of the second account in the virtual scene, and taking an account having a target relationship with the second account in the historical operation record as a candidate account, wherein the target relationship includes at least one of a cooperative relationship, an adversarial relationship, and a neutral relationship; determining a parameter condition according to an account attribute parameter of the second account in the virtual scene, and taking accounts whose account attribute parameters satisfy the parameter condition as candidate accounts, wherein the account attribute parameters include at least one of a level, a number of virtual resources, and a number of games.
For example, the account attribute parameter of the second account in the virtual scene may be scaled down (e.g., divided by 2) to obtain an adjusted threshold, and accounts whose account attribute parameters are smaller than the adjusted threshold are used as candidate accounts. In this way, the candidate accounts are guaranteed to be inferior to the second account in account attribute parameters.
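The weakened-threshold rule just described can be sketched as follows; the function name and the use of level as the attribute parameter are illustrative assumptions.

```python
def candidates_by_weakened_threshold(second_account_level, accounts):
    """Select accounts whose level falls below half the second account's level.

    accounts: mapping of account id -> level in the virtual scene.
    """
    threshold = second_account_level / 2  # "weakened" (adjusted) threshold
    return [acc for acc, level in accounts.items() if level < threshold]
```

Any other attribute parameter (number of virtual resources, number of games) could be substituted for level, and the divisor 2 is just one possible weakening factor.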
In some embodiments, prior to step 201, further comprising: screening the multiple candidate accounts according to screening parameters respectively corresponding to the multiple candidate accounts different from the second account to obtain a first account; wherein the screening parameters include at least one of: friend affinity with the second account; a geographic location distance from the second account; historical synchronization times with the second account; the number of occurrences in the historical operating record of the second account; account attribute parameters in the virtual scene; the account attribute parameters include at least one of a rank, a number of virtual resources, and a number of games.
Similarly, a plurality of candidate accounts different from the second account may be screened according to the screening parameters to obtain the first account. The objectives of the screening may include at least one of the following: the friend affinity between the first account and the second account is the highest; the geographic distance between the first account and the second account is the smallest; the number of historical synchronizations between the first account and the second account is the largest; the first account appears the most times in the historical operation record of the second account; the account attribute parameter of the first account in the virtual scene is the lowest.
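One way to combine the screening parameters above is a single weighted score, picking the best-scoring candidate as the first account. The weights and field names below are illustrative assumptions, not prescribed by the embodiment.

```python
def pick_first_account(candidates):
    """candidates: list of dicts carrying screening parameters per account."""
    def score(c):
        return (c["friend_affinity"]           # higher affinity is better
                - c["geo_distance_km"] * 0.1   # closer is better
                + c["history_sync_count"]      # more shared history is better
                - c["level"] * 0.5)            # lower attribute params preferred
    return max(candidates, key=score)["account_id"]
```

In practice the weights would be tuned so that, for example, a nearby close friend with many past synchronizations outranks a distant stranger.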
In some embodiments, prior to step 201, further comprising: receiving a synchronization request sent by a first account; the synchronization request comprises at least one of account identification of the first account and scene identification of the virtual scene; and responding to the confirmation operation for the synchronous request, and sending the confirmation information for the synchronous request to the first account, so that the first account synchronizes the virtual scene to the second man-machine interaction interface according to the confirmation information of the synchronous request.
Here, the received synchronization request sent by the first account may be presented in the second man-machine interaction interface, as shown in fig. 8B. And when the confirmation operation for the synchronous request is received in the second man-machine interaction interface, the confirmation information for the synchronous request is sent to the first account, so that the first account synchronizes the virtual scene to the second man-machine interaction interface according to the confirmation information for the synchronous request.
In some embodiments, prior to step 201, further comprising: responding to a synchronous triggering operation aiming at the virtual scene, and sending a synchronous request to the first account so as to enable the first account to synchronize the virtual scene to the second man-machine interaction interface according to the synchronous request; the synchronization request includes an account identifier of the second account.
Here, the second account may also actively initiate the synchronization request. For example, when a synchronization trigger operation for the virtual scene is received in the second man-machine interaction interface, a synchronization request is sent to the first account, so that the first account synchronizes the virtual scene to the second man-machine interaction interface according to the synchronization request (the first account can confirm the synchronization request and then synchronize the virtual scene). The synchronization request includes an account identifier of the second account.
In some embodiments, further comprising: presenting a synchronization entry for the virtual scene in a second human-computer interaction interface; and taking the triggering operation aiming at the synchronous entrance as the synchronous triggering operation aiming at the virtual scene.
Here, the synchronization portal for the virtual scene may be presented at any position in the second human-machine interaction interface, similar to fig. 7C. Then, the trigger operation for the synchronization portal may be regarded as the synchronization trigger operation for the virtual scene.
In some embodiments, prior to step 201, further comprising: and sending the synchronous region parameters to the first account so that the first account synchronizes the synchronous region corresponding to the synchronous region parameters in the virtual scene to the second man-machine interaction interface.
In some embodiments, between any of the steps, further comprising: and receiving the consultation information sent by the first account and presenting the consultation information in the second man-machine interaction interface.
Here, the consultation information may be presented at any position of the second man-machine interaction interface, as an example, see fig. 8C. Thus, the user using the second account is convenient to input accurate interaction information.
In step 202, interaction information for the virtual scene is received through a second human-computer interaction interface.
Here, the interaction information input by the user using the second account number for the virtual scene is received through the second man-machine interaction interface.
In some embodiments, when presenting the virtual scene synchronized based on the first man-machine interaction interface in the second man-machine interaction interface corresponding to the second account, the method further includes: establishing interactive connection between a first account and a second account; the interactive connection is used for sending the interactive information to the first account.
In some embodiments, the above-mentioned establishment of the interactive connection between the first account and the second account may be achieved by: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the synchronization triggering operation; the synchronization triggering operation is used for triggering the synchronization of the virtual scene; the interactive connection is used for transmitting interactive information conforming to the target media type.
In some embodiments, further comprising: responding to an interaction triggering operation aiming at a virtual scene, and establishing interaction connection between a first account and a second account; the interactive connection is used for sending the interactive information to the first account.
The interactive triggering operation refers to the interactive triggering operation received in the second man-machine interaction interface.
In some embodiments, the above-mentioned interactive triggering operation in response to the virtual scene may be implemented in such a manner that an interactive connection between the first account and the second account is established: presenting an interaction inlet for the virtual scene in a second man-machine interaction interface; and taking the triggering operation aiming at the interaction entrance as the interaction triggering operation aiming at the virtual scene.
Here, the interaction portal for the virtual scene may be presented at any position in the second human-computer interaction interface, similar to fig. 7G. Then, the triggering operation for the interaction entrance can be used as the interaction triggering operation for the virtual scene.
In some embodiments, the above-mentioned interactive triggering operation in response to the virtual scene may be implemented in such a manner that an interactive connection between the first account and the second account is established: responding to the interaction triggering operation aiming at the virtual scene, and sending a connection request to a first account; the connection request comprises an account identifier of the second account; when the confirmation information of the first account number on the connection request is received, the interactive connection between the first account number and the second account number is established.
In some embodiments, further comprising: receiving a connection request sent by a first account based on a first man-machine interaction interface; the connection request comprises at least one of an account number identifier of the first account number and a scene identifier of the virtual scene; and taking the confirmation operation aiming at the connection request as the interaction triggering operation aiming at the virtual scene.
Here, the connection request sent by the first account may be presented on the second man-machine interaction interface, and a confirmation operation for the connection request may be received based on the second man-machine interaction interface.
In some embodiments, the above-mentioned establishment of the interactive connection between the first account and the second account may be achieved by: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the interactive triggering operation; the interactive connection is used for transmitting interactive information conforming to the target media type.
In some embodiments, further comprising: presenting at least one of an interactive connection state, interactive information, ending an interactive portal and ending a synchronous portal in a second man-machine interactive interface; the interactive connection is used for sending the interactive information to the first account; ending the interactive portal for disconnecting the interactive connection when triggered; and the end synchronization entry is used for disconnecting the interactive connection when triggered and stopping presenting the virtual scene in the second man-machine interaction interface.
Here, at least one of the state of the interactive connection, the interactive information, the end of the interactive portal, and the end of the synchronization portal may be presented in the second man-machine interactive interface in a manner similar to fig. 7J.
In step 203, the interaction information is sent to the first account, so that the first account presents the interaction information in the first man-machine interaction interface; the interaction information is used for guiding the control operation of the first account in the virtual scene.
The interaction information received based on the second man-machine interaction interface is sent to the first account number to be presented in the first man-machine interaction interface.
As shown in fig. 5, in the embodiment of the present application, the second account can effectively guide the first account, helping the first account better control the virtual scene.
Referring to fig. 6, fig. 6 is a flowchart of an interaction method of a virtual scene according to an embodiment of the present application, and will be described with reference to steps shown in fig. 6 from the perspective of interaction between a terminal device 400 (a first terminal device) and a terminal device 500 (a second terminal device) shown in fig. 2.
In step 301, a first terminal device presents a virtual scene in a first man-machine interaction interface; the virtual scene is logged in based on the first account number.
In step 302, the first terminal device synchronizes the virtual scene to the second terminal device in response to the synchronization trigger operation for the virtual scene.
In step 303, the second terminal device presents the virtual scene synchronized based on the first man-machine interaction interface in the second man-machine interaction interface corresponding to the second account.
In step 304, the second terminal device receives the interaction information for the virtual scene through the second man-machine interaction interface.
In step 305, the second terminal device sends the interaction information to the first terminal device.
In step 306, the first terminal device presents the interaction information in the first man-machine interaction interface.
As shown in fig. 6, through interaction between the first terminal device and the second terminal device, the control effect of the first account number in the first terminal device on the virtual scene can be improved, and meanwhile, the actual utilization rate of the computing resources consumed by the first terminal device can be improved.
In the following, an exemplary application of the embodiment of the present application in an actual scene is described; for ease of understanding, the case where the virtual scene belongs to a mahjong game is taken as an example. The embodiment of the present application can provide a companion-play feature for players of a mahjong game: the game view of account A (corresponding to the first account above) can be projected onto the screen of account B (corresponding to the second account above) through a screen sharing function, and the player using account B can make decisions together with the player using account A according to the real-time game situation, where the screen is a man-machine interaction interface. For example, a novice player may invite a veteran (senior) player to accompany his or her play, and they may communicate by voice (communication is not limited to voice and may also be text communication, video communication, etc.). This veteran-mentors-newcomer mode enables a novice player to quickly learn the playing logic of the mahjong game, improving the novice player's experience and skills, increasing the veteran player's sense of achievement, promoting interaction between new and veteran players, and improving player retention in the mahjong game.
Application examples of the embodiments of the present application include, but are not limited to: 1) when a new player is learning the mahjong playing method, a senior player can be invited to accompany and guide the play, so that the new player can quickly become familiar with the game and enjoy it more; 2) a player who finds playing alone uninteresting can invite other players to watch the game and communicate and interact with one another.
Next, a description will be given from the viewpoint of account A. While account A is in the preparation phase of a game round, the player holding account A may choose to invite an accompanying player or choose to play the game round alone. For account A, the corresponding first man-machine interaction interface may include a virtual scene and an interaction area isolated from each other. As an example, a schematic diagram of the first man-machine interaction interface corresponding to account A is shown in fig. 9A: fig. 9A shows a virtual scene 91 and a blank interaction area, and the virtual scene 91 includes a preparation entry 911, a synchronization entry 912, a synchronization entry 913, and an account identifier 914 of account A (fig. 9A takes the avatar of account A as an example). The preparation entry 911 is used to update the state of account A from an unprepared state to a prepared state when triggered (when the states of the four accounts participating in a game round are all the prepared state, the game round starts); the synchronization entry 912 and the synchronization entry 913 are each used to provide the ability to invite an accompanying player; and the account identifier 914 includes the avatar of account A and the number of virtual resources held by account A, namely 10,700.
When a trigger operation for the synchronization entry 912 or the synchronization entry 913 is received, a plurality of candidate accounts may be presented in the first man-machine interaction interface, for example, accounts that have a friend relationship with account A and are online in the mahjong game. Taking the case where the synchronization entry 913 is triggered as an example, fig. 9B exemplarily shows 6 candidate accounts; for each candidate account, a name (name 1, name 2, … name 6 as shown in fig. 9B), an avatar, and a game rank are shown. It should be noted that, in addition to accounts having a friend relationship with account A in the mahjong game, accounts having a friend relationship with account A in an instant messaging application may also be used as candidate accounts, where the mahjong game and the instant messaging application share account data; of course, the selection of candidate accounts is not limited to the instant messaging application.
When a selection operation for the plurality of candidate accounts is received (e.g., a trigger operation for the invitation portal 915 shown in fig. 9B), a synchronization request (an accompany-play invitation) may be sent to the selected candidate account (hereinafter referred to as account B for ease of distinction). When the confirmation information of account B for the synchronization request is received, the virtual scene presented in the first man-machine interaction interface is synchronized (shared) to the second man-machine interaction interface corresponding to account B.
Based on the synchronized virtual scene, account A or account B may initiate a voice connection. As an example, an interaction entry 92 and an end synchronization entry 93 are shown in the interaction area of fig. 9C. Upon receiving a trigger operation for the interaction entry 92, a connection request (a voice connection request) is sent to account B, the connection request including the name of account A; as an example, fig. 9D shows the connection request presented in the interaction area of the second man-machine interaction interface. When the confirmation information of account B for the connection request is received, a voice communication connection between account A and account B is established; as shown in fig. 9E, the state of the voice communication connection (i.e. "connected with account B") and an end interaction entry 94 may be presented in the interaction area of the first man-machine interaction interface. When a trigger operation for the end synchronization entry 93 is received, the connection between account A and account B is disconnected, and the synchronization of the virtual scene to the second man-machine interaction interface is stopped; when a trigger operation for the end interaction entry 94 is received, the connection between account A and account B is disconnected, but the synchronization of the virtual scene to the second man-machine interaction interface is still maintained.
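The paragraph above distinguishes two tear-down operations: ending the interaction drops only the voice connection, while ending the synchronization drops both the voice connection and the screen sharing. A minimal sketch of that session state machine follows; the `Session` class and its method names are illustrative assumptions rather than anything defined in the patent.

```python
class Session:
    """Hypothetical session tracking screen synchronization plus an optional voice connection."""
    def __init__(self, host: str, guest: str):
        self.host, self.guest = host, guest
        self.synchronized = True   # set once the synchronization request is confirmed
        self.connected = False     # voice connection not yet established

    def request_connection(self, initiator: str) -> dict:
        # Either side may initiate; the request carries the initiator's name.
        return {"type": "connection_request", "from": initiator}

    def confirm_connection(self) -> None:
        # Peer confirmed the connection request: establish the voice connection.
        self.connected = True

    def end_interaction(self) -> None:
        # "End interaction" entry: drop only the voice connection,
        # screen synchronization continues.
        self.connected = False

    def end_synchronization(self) -> None:
        # "End synchronization" entry: drop the voice connection
        # and stop sharing the virtual scene.
        self.connected = False
        self.synchronized = False
```

For example, after `confirm_connection()` followed by `end_interaction()`, `synchronized` is still `True`, matching the behavior described for entry 94 above.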
Next, a description will be given from the viewpoint of account B. For account B, the synchronization request may be received in the mahjong game (e.g., in a casino of the mahjong game) or in the instant messaging application, although the manner of reception is not limited thereto. As an example, a schematic diagram of the second man-machine interaction interface corresponding to account B is shown in fig. 10A: the synchronization request sent by account A may be presented in the second man-machine interaction interface, and the synchronization request may include the name of account A and a scene identifier of the virtual scene to be synchronized (e.g., an advanced mahjong room). When a trigger operation for the "accept" control is received, confirmation information for the synchronization request is sent to account A, so as to prompt account A to synchronize the virtual scene.
For the virtual scene synchronized from account A to account B, the virtual scene may be presented in the second man-machine interaction interface; as an example, a schematic diagram is shown in fig. 10B. Fig. 10B shows the virtual scene 101 synchronized by account A, and an interaction area different from the area where the virtual scene 101 is located; the interaction area includes an interaction entry 102 and an end synchronization entry 103, where the interaction entry 102 functions similarly to the interaction entry 92 above, and the end synchronization entry 103 functions similarly to the end synchronization entry 93 above. Upon receiving a trigger operation for the interaction entry 102, a connection request is sent to account A, the connection request including the name of account B; as an example, fig. 10C shows the connection request presented in the interaction area of the first man-machine interaction interface. When the confirmation information of account A for the connection request is received, a voice communication connection between account A and account B is established; as shown in fig. 10D, the state of the voice communication connection (i.e. "connected with account A") and an end interaction entry 104 may be presented in the interaction area of the second man-machine interaction interface, the end interaction entry 104 functioning similarly to the end interaction entry 94 above.
Next, a description will be given from the viewpoint of the underlying implementation. The interaction scheme for the virtual scene provided by the embodiment of the present application may include two parts, screen view sharing and voice call, as shown in fig. 11. Screen view sharing means that the view of account A is shared with account B in real time, so that the player using account B can offer advice to the player using account A according to the shared picture; voice call means establishing a voice call connection (corresponding to the interactive connection above) between account A and account B, to facilitate communication between the player using account B and the player using account A.
In addition, a flow diagram of the interaction method for the virtual scene is provided, as shown in fig. 12. When a player enters a game round through account A, the player may choose not to invite anyone, i.e., to play the game round alone, or may choose to invite account B. Upon receiving the accompany-play invitation (i.e., the synchronization request) sent by account A, account B may choose to accept or reject it. When the accompany-play invitation sent by account A is rejected, the accompany-play invitation may be sent to other accounts. When account B accepts the accompany-play invitation sent by account A, the virtual scene in the first man-machine interaction interface corresponding to account A is synchronized to the second man-machine interaction interface corresponding to account B. Either account A or account B may then send a voice invitation to the peer to establish a voice connection.
It should be noted that the specific techniques used for screen view sharing and for voice call are not limited in the embodiments of the present application.
Continuing with the description of an exemplary structure in which the virtual scene interaction device 455 provided by the embodiments of the present application is implemented as software modules, in some embodiments, as shown in fig. 3A, the software modules stored in the virtual scene interaction device 455 in the memory 450 may include: a scene presenting module 4551 configured to present a virtual scene in the first human-computer interaction interface, where the virtual scene is logged into based on a first account; a scene synchronization module 4552 configured to synchronize the virtual scene to a second human-computer interaction interface corresponding to a second account in response to a synchronization trigger operation for the virtual scene; and an information presentation module 4553 configured to receive interaction information sent by the second account based on the second man-machine interaction interface, and present the interaction information in the first man-machine interaction interface, where the interaction information is used to guide the control operation of the first account in the virtual scene.
In some embodiments, the scene synchronization module 4552 is further configured to: presenting a plurality of candidate accounts different from the first account in a first human-computer interaction interface; and responding to the selection operation of the plurality of candidate accounts, and taking the selected candidate account as a second account.
In some embodiments, the scene synchronization module 4552 is further configured to perform at least one of: taking an account with a friend relation with the first account as a candidate account; determining a synchronous geographic range centering on the geographic position of the first account, and taking the account in the synchronous geographic range as a candidate account; taking the history synchronous account number of the first account number as a candidate account number; wherein the history synchronization account represents an account that historically was synchronized by the first account with the virtual scene; determining a historical operation record of the first account in the virtual scene, and taking an account with a target relation with the first account in the historical operation record as a candidate account; wherein the target relationship includes at least one of a synergistic relationship, an antagonistic relationship, and a neutral relationship; determining parameter conditions according to account attribute parameters of the first account in the virtual scene, and taking the account with the account attribute parameters meeting the parameter conditions as a candidate account; wherein the account attribute parameters include at least one of a level, a number of virtual resources, and a number of games.
In some embodiments, the scene synchronization module 4552 is further configured to: screening the multiple candidate accounts according to screening parameters respectively corresponding to the multiple candidate accounts different from the first account to obtain a second account; wherein the screening parameters include at least one of: friend affinity with the first account; a geographic location distance from the first account; historical synchronization times with the first account; the number of occurrences in the historical operating record of the first account; account attribute parameters in the virtual scene; the account attribute parameters include at least one of a rank, a number of virtual resources, and a number of games.
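The two embodiments above first gather candidate accounts (friend relationship, geographic range, history, attribute parameters) and then screen them by parameters such as friend affinity, distance, and synchronization history. One common way to realize such screening is a weighted score over the parameters; the sketch below assumes that approach, and all field names, weights, and values are illustrative placeholders, not values from the patent.

```python
def screen_candidates(candidates, weights, top_k=1):
    """Rank candidate accounts by a weighted sum of their screening parameters.

    candidates: list of dicts mapping parameter name -> numeric value.
    weights: dict mapping parameter name -> weight (negative weight penalizes,
             e.g. a larger geographic distance lowers the score).
    """
    def score(acct):
        return sum(weights.get(k, 0.0) * acct.get(k, 0.0) for k in weights)
    return sorted(candidates, key=score, reverse=True)[:top_k]

candidates = [
    {"name": "name1", "affinity": 0.9, "level": 12, "distance_km": 3.0},
    {"name": "name2", "affinity": 0.4, "level": 30, "distance_km": 1.0},
]
# Hypothetical weighting: affinity matters most, higher level helps,
# greater distance slightly penalizes.
weights = {"affinity": 10.0, "level": 0.5, "distance_km": -0.2}
best = screen_candidates(candidates, weights)
print(best[0]["name"])  # name2
```

A real system could equally screen by hard filters (e.g. only accounts above a level threshold) before or instead of scoring; the weighted sum is just one plausible realization.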
In some embodiments, the scene synchronization module 4552 is further configured to: presenting a synchronization entry for the virtual scene in the first human-computer interaction interface; and taking the triggering operation aiming at the synchronous entrance as the synchronous triggering operation aiming at the virtual scene.
In some embodiments, the scene synchronization module 4552 is further configured to: responding to a synchronous triggering operation aiming at the virtual scene, and sending a synchronous request to a second account; the synchronization request comprises at least one of account identification of the first account and scene identification of the virtual scene; and synchronizing the virtual scene to a second man-machine interaction interface corresponding to the second account when receiving the confirmation information of the second account for the synchronization request.
In some embodiments, the scene synchronization module 4552 is further configured to: presenting a synchronous request sent by a second account in a first man-machine interaction interface; the synchronization request comprises an account identifier of the second account; and taking the confirmation operation for the synchronous request as a synchronous triggering operation for the virtual scene.
In some embodiments, the scene synchronization module 4552 is further configured to: establishing interactive connection between a first account and a second account; the interactive connection is used for receiving the interactive information sent by the second account based on the second man-machine interaction interface.
In some embodiments, the scene synchronization module 4552 is further configured to: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the synchronization triggering operation; the interactive connection is used for transmitting interactive information conforming to the target media type.
In some embodiments, the interaction device 455 of the virtual scene further includes a first connection establishment module for establishing an interactive connection between the first account and the second account in response to an interactive triggering operation for the virtual scene; the interactive connection is used for receiving the interactive information sent by the second account based on the second man-machine interaction interface.
In some embodiments, the first connection establishment module is further to: presenting an interaction entry for the virtual scene in a first human-computer interaction interface; and taking the triggering operation for the interaction entry as the interaction triggering operation for the virtual scene.
In some embodiments, the first connection establishment module is further to: responding to the interaction triggering operation aiming at the virtual scene, and sending a connection request to a second account; the connection request comprises at least one of account identification of the first account and scene identification of the virtual scene; and when receiving the confirmation information of the second account for the connection request, establishing interactive connection between the first account and the second account.
In some embodiments, the first connection establishment module is further to: receiving a connection request sent by a second account based on a second man-machine interaction interface; the connection request comprises an account identifier of the second account; and taking the confirmation operation aiming at the connection request as the interaction triggering operation aiming at the virtual scene.
In some embodiments, the first connection establishment module is further to: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the interactive triggering operation; the interactive connection is used for transmitting interactive information conforming to the target media type.
In some embodiments, the information presentation module 4553 is further configured to: presenting at least one of a state of interactive connection, interactive information, ending an interactive portal and ending a synchronous portal in a first man-machine interactive interface; the interactive connection is used for receiving the interactive information sent by the second account; ending the interactive portal for disconnecting the interactive connection when triggered; and the end synchronization entry is used for disconnecting the interactive connection when triggered and stopping synchronizing the virtual scene to the second man-machine interaction interface.
In some embodiments, the scene synchronization module 4552 is further configured to: synchronize the virtual scene to the second human-computer interaction interface corresponding to the second account when a synchronization condition is met; wherein the synchronization condition includes at least one of the following: the duration for which the first account performs no operation in the virtual scene reaches a duration threshold; an account attribute parameter of the first account in the virtual scene is smaller than a parameter threshold, wherein the account attribute parameters include at least one of a level, a number of virtual resources, and a number of rounds; and the win rate of the first account predicted from the image data of the virtual scene is less than a win rate threshold.
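The embodiment above triggers synchronization automatically when any one of three conditions holds: the idle duration reaches a threshold, an account attribute parameter falls below a threshold, or the predicted win rate falls below a threshold. A minimal sketch of that check follows; all threshold values are illustrative assumptions, not values specified by the patent.

```python
def should_trigger_sync(idle_seconds, attribute_value, predicted_win_rate,
                        idle_threshold=60.0, attribute_threshold=5,
                        win_rate_threshold=0.3):
    """Return True when at least one synchronization condition is met.

    idle_seconds: how long the first account has performed no operation.
    attribute_value: an account attribute parameter (level, virtual resources, rounds).
    predicted_win_rate: win rate predicted from the virtual scene's image data.
    Thresholds are illustrative placeholders.
    """
    return (idle_seconds >= idle_threshold          # no operation for too long
            or attribute_value < attribute_threshold  # attribute below parameter threshold
            or predicted_win_rate < win_rate_threshold)  # predicted win rate too low
```

Because the conditions are disjunctive, a struggling player (low win rate) is offered help even while actively playing, and an idle player is offered help regardless of skill.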
In some embodiments, the scene synchronization module 4552 is further configured to: any one of the following processes is performed: responding to a synchronous region selection operation aiming at the virtual scene, and synchronizing the selected synchronous region to a second human-computer interaction interface; sensitive information shielding processing is carried out on the virtual scene, and the shielded virtual scene is synchronized to a second human-computer interaction interface; and receiving the synchronous region parameters sent by the second account, and synchronizing the synchronous region corresponding to the synchronous region parameters in the virtual scene to the second man-machine interaction interface.
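Two of the processes above operate on the shared picture itself: blanking out sensitive regions before sharing, and synchronizing only a selected sub-region. Treating the picture as a 2D pixel array, both can be sketched in one helper; the rectangle convention `(top, left, bottom, right)` and the function name are illustrative assumptions.

```python
def mask_and_crop(frame, region=None, sensitive_regions=()):
    """Prepare a frame of the virtual scene for synchronization.

    frame: 2D list of pixel values (rows of columns).
    sensitive_regions: rectangles (top, left, bottom, right) blanked to 0
                       before sharing (sensitive-information shielding).
    region: optional (top, left, bottom, right) sub-rectangle to synchronize,
            e.g. a region selected by the first account or requested via the
            second account's synchronization region parameters.
    """
    out = [row[:] for row in frame]          # do not mutate the caller's frame
    for (t, l, b, r) in sensitive_regions:
        for y in range(t, b):
            for x in range(l, r):
                out[y][x] = 0                # blank sensitive pixels
    if region is not None:
        t, l, b, r = region
        out = [row[l:r] for row in out[t:b]]  # crop to the synchronized region
    return out
```

For example, masking the top-left pixel of a 2x2 frame `[[1, 2], [3, 4]]` yields `[[0, 2], [3, 4]]`, while cropping to the first row yields `[[1, 2]]`.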
In some embodiments, the scene synchronization module 4552 is further configured to: carrying out semantic extraction processing on the image data of the virtual scene to obtain consultation information; and sending the consultation information to the second account.
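Per claim 1, the consultation information is obtained by extracting semantic information from the image data and then adding a structured sentence to it. Assuming the extraction step yields a dict of semantic fields, the structuring step can be sketched as a template fill; the template wording and field names below are illustrative assumptions, and the extraction itself (e.g. by an image recognition model) is out of scope here.

```python
def build_consultation_info(semantics: dict) -> str:
    """Wrap extracted semantic fields in a structured sentence to form
    the consultation information sent to the second account.
    The template and field names are illustrative placeholders."""
    template = ("Account {account} holds {tiles} and the win rate is {win_rate:.0%}; "
                "please advise on the next move.")
    return template.format(**semantics)

# Hypothetical semantics extracted from the virtual scene's image data.
msg = build_consultation_info({"account": "A", "tiles": "3 of circles", "win_rate": 0.42})
print(msg)
```

The structured sentence gives the second account's player context without requiring them to parse the raw scene themselves.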
Continuing with the description of an exemplary structure in which the interaction device 555 for a virtual scene provided by the embodiments of the present application is implemented as software modules, in some embodiments, as shown in fig. 3B, the software modules stored in the interaction device 555 for a virtual scene in the memory 550 may include: a synchronous presenting module 5551 configured to present, in a second man-machine interaction interface corresponding to a second account, a virtual scene synchronized based on the first man-machine interaction interface, where the virtual scene is logged into based on a first account; a receiving module 5552 configured to receive interaction information for the virtual scene through the second human-computer interaction interface; and a sending module 5553 configured to send the interaction information to the first account, so that the first account presents the interaction information in the first human-computer interaction interface, where the interaction information is used to guide the control operation of the first account in the virtual scene.
In some embodiments, the synchronous presentation module 5551 is further to: presenting a plurality of candidate accounts different from the second account in a second human-computer interaction interface; and responding to the selection operation of the plurality of candidate accounts, and taking the selected candidate account as a first account.
In some embodiments, the synchronous rendering module 5551 is further configured to perform at least one of the following: taking an account with a friend relation with the second account as a candidate account; determining a synchronous geographic range centering on the geographic position of the second account, and taking the account in the synchronous geographic range as a candidate account; taking the history synchronous account number of the second account number as a candidate account number; wherein the history synchronization account represents an account that historically was synchronized with the virtual scene by the second account; determining a historical operation record of the second account in the virtual scene, and taking an account with a target relation with the second account in the historical operation record as a candidate account; wherein the target relationship includes at least one of a synergistic relationship, an antagonistic relationship, and a neutral relationship; determining parameter conditions according to account attribute parameters of the second account in the virtual scene, and taking the account with the account attribute parameters meeting the parameter conditions as a candidate account; wherein the account attribute parameters include at least one of a level, a number of virtual resources, and a number of games.
In some embodiments, the synchronous presentation module 5551 is further to: screening the multiple candidate accounts according to screening parameters respectively corresponding to the multiple candidate accounts different from the second account to obtain a first account; wherein the screening parameters include at least one of: friend affinity with the second account; a geographic location distance from the second account; historical synchronization times with the second account; the number of occurrences in the historical operating record of the second account; account attribute parameters in the virtual scene; the account attribute parameters include at least one of a rank, a number of virtual resources, and a number of games.
In some embodiments, the synchronous presentation module 5551 is further to: receiving a synchronization request sent by a first account; the synchronization request comprises at least one of account identification of the first account and scene identification of the virtual scene; and responding to the confirmation operation for the synchronous request, and sending the confirmation information for the synchronous request to the first account, so that the first account synchronizes the virtual scene to the second man-machine interaction interface according to the confirmation information of the synchronous request.
In some embodiments, the synchronous presentation module 5551 is further to: responding to a synchronous triggering operation aiming at the virtual scene, and sending a synchronous request to the first account so as to enable the first account to synchronize the virtual scene to the second man-machine interaction interface according to the synchronous request; the synchronization request includes an account identifier of the second account.
In some embodiments, the synchronous presentation module 5551 is further to: presenting a synchronization entry for the virtual scene in a second human-computer interaction interface; and taking the triggering operation aiming at the synchronous entrance as the synchronous triggering operation aiming at the virtual scene.
In some embodiments, the synchronous presentation module 5551 is further to: establishing interactive connection between a first account and a second account; the interactive connection is used for sending the interactive information to the first account.
In some embodiments, the synchronous presentation module 5551 is further to: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the synchronization triggering operation; the synchronization triggering operation is used for triggering the synchronization of the virtual scene; the interactive connection is used for transmitting interactive information conforming to the target media type.
In some embodiments, the interaction device 555 of the virtual scene further includes a second connection establishment module configured to: responding to an interaction triggering operation aiming at a virtual scene, and establishing interaction connection between a first account and a second account; the interactive connection is used for sending the interactive information to the first account.
In some embodiments, the second connection establishment module is further to: presenting an interaction entry for the virtual scene in a second man-machine interaction interface; and taking the triggering operation for the interaction entry as the interaction triggering operation for the virtual scene.
In some embodiments, the second connection establishment module is further to: responding to the interaction triggering operation aiming at the virtual scene, and sending a connection request to a first account; the connection request comprises an account identifier of the second account; when the confirmation information of the first account number on the connection request is received, the interactive connection between the first account number and the second account number is established.
In some embodiments, the second connection establishment module is further to: receiving a connection request sent by a first account based on a first man-machine interaction interface; the connection request comprises at least one of an account number identifier of the first account number and a scene identifier of the virtual scene; and taking the confirmation operation aiming at the connection request as the interaction triggering operation aiming at the virtual scene.
In some embodiments, the second connection establishment module is further to: establishing interactive connection between a first account and a second account according to the target media type; wherein the target media type represents the media type specified by the interactive triggering operation; the interactive connection is used for transmitting interactive information conforming to the target media type.
In some embodiments, the synchronous presentation module 5551 is further to: presenting at least one of an interactive connection state, interactive information, ending an interactive portal and ending a synchronous portal in a second man-machine interactive interface; the interactive connection is used for sending the interactive information to the first account; ending the interactive portal for disconnecting the interactive connection when triggered; and the end synchronization entry is used for disconnecting the interactive connection when triggered and stopping presenting the virtual scene in the second man-machine interaction interface.
In some embodiments, the synchronous presentation module 5551 is further to: and sending the synchronous region parameters to the first account so that the first account synchronizes the synchronous region corresponding to the synchronous region parameters in the virtual scene to the second man-machine interaction interface.
In some embodiments, the synchronous presentation module 5551 is further to: and receiving the consultation information sent by the first account and presenting the consultation information in the second man-machine interaction interface.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions (i.e., executable instructions) stored in a computer readable storage medium. The processor of the electronic device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the electronic device executes the interaction method of the virtual scene according to the embodiment of the application.
The embodiment of the application provides a computer readable storage medium, wherein executable instructions are stored, and when the executable instructions are executed by a processor, the processor is caused to execute the interaction method of the virtual scene provided by the embodiment of the application.
In some embodiments, the computer readable storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; or it may be any of a variety of devices including one of the above memories or any combination thereof.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one electronic device or on multiple electronic devices located at one site or, alternatively, on multiple electronic devices distributed across multiple sites and interconnected by a communication network.
The above is merely an example of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (22)

1. A method for interaction in a virtual scene, the method comprising:
presenting a virtual scene in a first human-computer interaction interface, wherein the virtual scene is logged into based on a first account;
in response to a synchronization trigger operation for the virtual scene, synchronizing the virtual scene to a second human-computer interaction interface corresponding to a second account, wherein synchronizing the virtual scene to the second human-computer interaction interface comprises: in response to a synchronization region selection operation for the virtual scene, synchronizing the selected synchronization region to the second human-computer interaction interface; or receiving a synchronization region parameter sent by the second account, and synchronizing a synchronization region in the virtual scene corresponding to the synchronization region parameter to the second human-computer interaction interface;
performing semantic extraction processing on image data of the virtual scene to obtain semantic information in the virtual scene;
adding a structured sentence to the semantic information to obtain consultation information, and sending the consultation information to the second account;
and receiving interaction information sent by the second account based on the second human-computer interaction interface, and presenting the interaction information in the first human-computer interaction interface, wherein the interaction information is used for guiding a control operation of the first account in the virtual scene.
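As an illustration only (not claim language), the consultation-information step of claim 1 — extracting semantics from scene image data and wrapping them in a structured sentence — can be sketched as follows; all function names, the stand-in semantic tags, and the sentence template are hypothetical assumptions, not taken from the patent:

```python
# Hypothetical sketch of the consultation-information step in claim 1:
# extract semantic tags from the scene's image data, then add a
# structured sentence around them before sending to the second account.

def extract_semantics(image_data: bytes) -> list[str]:
    # Placeholder for a real image-recognition model; we pretend the
    # scene is recognized to contain these elements.
    return ["player_character", "boss_enemy", "low_health_bar"]

def build_consultation(semantics: list[str]) -> str:
    # Wrap the raw semantic tags in a structured sentence template.
    return ("I am currently facing: " + ", ".join(semantics)
            + ". How should I operate?")

message = build_consultation(extract_semantics(b"<frame>"))
print(message)
# -> I am currently facing: player_character, boss_enemy, low_health_bar. How should I operate?
```

The message, rather than raw image data, is what would be sent to the second account as consultation information.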
2. The method according to claim 1, wherein, before synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, the method further comprises:
presenting, in the first human-computer interaction interface, a plurality of candidate accounts different from the first account;
and in response to a selection operation on the plurality of candidate accounts, taking the selected candidate account as the second account.
3. The method according to claim 2, further comprising:
performing at least one of the following:
taking an account having a friend relationship with the first account as a candidate account;
determining a synchronization geographic range centered on the geographic location of the first account, and taking an account within the synchronization geographic range as a candidate account;
taking a historical synchronization account of the first account as a candidate account, wherein the historical synchronization account represents an account to which the first account has historically synchronized a virtual scene;
determining a historical operation record of the first account in the virtual scene, and taking an account having a target relationship with the first account in the historical operation record as a candidate account, wherein the target relationship includes at least one of a cooperative relationship, an adversarial relationship, and a neutral relationship;
determining a parameter condition according to an account attribute parameter of the first account in the virtual scene, and taking an account whose account attribute parameter satisfies the parameter condition as a candidate account, wherein the account attribute parameter includes at least one of a level, a number of virtual resources, and a number of games.
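Outside the claim language, the candidate-account rules of claim 3 can be sketched as predicates over a hypothetical account record; every field name and threshold below is an illustrative assumption:

```python
# Hypothetical sketch of the candidate-account rules in claim 3.
# Each rule is a predicate over an account record; an account becomes
# a candidate if any rule accepts it (the claim allows using any subset).

def is_candidate(account: dict, first: dict) -> bool:
    rules = [
        # friend relationship with the first account
        account["id"] in first["friends"],
        # within a synchronization geographic range centered on the first account
        abs(account["pos"] - first["pos"]) <= first["sync_range"],
        # historical synchronization account of the first account
        account["id"] in first["history_sync"],
        # target relationship (cooperative/adversarial/neutral) in the history record
        account["id"] in first["history_relations"],
        # account attribute parameter (here: level) satisfies a parameter condition
        account["level"] >= first["level"],
    ]
    return any(rules)

first = {"id": 1, "friends": {2}, "pos": 0, "sync_range": 5,
         "history_sync": set(), "history_relations": set(), "level": 10}
print(is_candidate({"id": 2, "pos": 100, "level": 3}, first))  # True (friend)
print(is_candidate({"id": 3, "pos": 100, "level": 3}, first))  # False (no rule matches)
```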
4. The method according to claim 1, wherein, before synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, the method further comprises:
screening a plurality of candidate accounts different from the first account according to screening parameters respectively corresponding to the plurality of candidate accounts, to obtain the second account;
wherein the screening parameters include at least one of:
a friend affinity with the first account;
a geographic distance from the first account;
a number of historical synchronizations with the first account;
a number of occurrences in the historical operation record of the first account;
an account attribute parameter in the virtual scene, the account attribute parameter including at least one of a level, a number of virtual resources, and a number of games.
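The screening of claim 4 might, for example, be realized as a scoring function over the screening parameters; the weights and field names in this sketch are purely illustrative assumptions:

```python
# Hypothetical sketch of claim 4: score each candidate account by its
# screening parameters and take the best-scoring one as the second account.

def pick_second_account(candidates: list[dict]) -> dict:
    def score(c: dict) -> float:
        # Weighted combination of the screening parameters; weights are
        # illustrative only (higher affinity/history/level is better,
        # greater geographic distance is worse).
        return (2.0 * c["friend_affinity"]
                - 0.1 * c["geo_distance"]
                + 1.0 * c["history_sync_count"]
                + 0.5 * c["history_record_count"]
                + 0.2 * c["level"])
    return max(candidates, key=score)

candidates = [
    {"id": "a", "friend_affinity": 0.9, "geo_distance": 3,
     "history_sync_count": 5, "history_record_count": 2, "level": 30},
    {"id": "b", "friend_affinity": 0.2, "geo_distance": 1,
     "history_sync_count": 0, "history_record_count": 0, "level": 60},
]
print(pick_second_account(candidates)["id"])  # -> a
```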
5. The method according to claim 1, further comprising:
presenting a synchronization entry for the virtual scene in the first human-computer interaction interface;
and taking a trigger operation on the synchronization entry as the synchronization trigger operation for the virtual scene.
6. The method according to claim 1, wherein synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account in response to the synchronization trigger operation for the virtual scene comprises:
in response to the synchronization trigger operation for the virtual scene, sending a synchronization request to the second account, wherein the synchronization request includes at least one of an account identifier of the first account and a scene identifier of the virtual scene;
and when confirmation information of the second account for the synchronization request is received, synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account.
7. The method according to claim 1, further comprising:
presenting, in the first human-computer interaction interface, a synchronization request sent by the second account, wherein the synchronization request includes an account identifier of the second account;
and taking a confirmation operation on the synchronization request as the synchronization trigger operation for the virtual scene.
8. The method according to claim 1, wherein, when synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account, the method further comprises:
establishing an interactive connection between the first account and the second account;
wherein the interactive connection is used for receiving the interaction information sent by the second account based on the second human-computer interaction interface.
9. The method according to claim 8, wherein establishing the interactive connection between the first account and the second account comprises:
establishing the interactive connection between the first account and the second account according to a target media type;
wherein the target media type represents the media type specified by the synchronization trigger operation, and the interactive connection is used for transmitting interaction information conforming to the target media type.
10. The method according to claim 1, further comprising:
in response to an interaction trigger operation for the virtual scene, establishing an interactive connection between the first account and the second account;
wherein the interactive connection is used for receiving the interaction information sent by the second account based on the second human-computer interaction interface.
11. The method according to claim 10, wherein establishing the interactive connection between the first account and the second account in response to the interaction trigger operation for the virtual scene comprises:
presenting an interaction entry for the virtual scene in the first human-computer interaction interface;
and taking a trigger operation on the interaction entry as the interaction trigger operation for the virtual scene.
12. The method according to claim 10, wherein establishing the interactive connection between the first account and the second account in response to the interaction trigger operation for the virtual scene comprises:
in response to the interaction trigger operation for the virtual scene, sending a connection request to the second account, wherein the connection request includes at least one of an account identifier of the first account and a scene identifier of the virtual scene;
and when confirmation information of the second account for the connection request is received, establishing the interactive connection between the first account and the second account.
13. The method according to claim 10, further comprising:
receiving a connection request sent by the second account based on the second human-computer interaction interface, wherein the connection request includes an account identifier of the second account;
and taking a confirmation operation on the connection request as the interaction trigger operation for the virtual scene.
14. The method according to claim 10, wherein establishing the interactive connection between the first account and the second account comprises:
establishing the interactive connection between the first account and the second account according to a target media type;
wherein the target media type represents the media type specified by the interaction trigger operation, and the interactive connection is used for transmitting interaction information conforming to the target media type.
15. The method according to any one of claims 1 to 14, further comprising:
presenting, in the first human-computer interaction interface, at least one of a state of an interactive connection, the interaction information, an end-interaction entry, and an end-synchronization entry;
wherein the interactive connection is used for receiving the interaction information sent by the second account; the end-interaction entry is used for disconnecting the interactive connection when triggered; and the end-synchronization entry is used for, when triggered, disconnecting the interactive connection and stopping synchronizing the virtual scene to the second human-computer interaction interface.
16. The method according to any one of claims 1 to 14, wherein synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account comprises:
when a synchronization condition is satisfied, synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account;
wherein the synchronization condition includes at least one of the following:
a duration of no operation by the first account in the virtual scene reaches a duration threshold;
an account attribute parameter of the first account in the virtual scene is smaller than a parameter threshold, wherein the account attribute parameter includes at least one of a level, a number of virtual resources, and a number of games;
a winning rate of the first account predicted according to the image data of the virtual scene is smaller than a winning-rate threshold.
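As non-claim illustration, the synchronization conditions of claim 16 amount to a disjunction of threshold checks; the parameter names and default thresholds below are assumptions for the sketch:

```python
# Hypothetical sketch of the synchronization conditions in claim 16:
# synchronize when at least one of the conditions holds.

def should_synchronize(idle_seconds: float, level: int, win_rate: float,
                       idle_threshold: float = 60.0,
                       level_threshold: int = 5,
                       win_rate_threshold: float = 0.3) -> bool:
    return (idle_seconds >= idle_threshold     # no-operation duration reached threshold
            or level < level_threshold         # account attribute below parameter threshold
            or win_rate < win_rate_threshold)  # predicted winning rate below threshold

print(should_synchronize(idle_seconds=90, level=10, win_rate=0.8))  # True (idle too long)
print(should_synchronize(idle_seconds=0, level=10, win_rate=0.8))   # False
```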
17. The method according to any one of claims 1 to 14, wherein synchronizing the virtual scene to the second human-computer interaction interface corresponding to the second account comprises:
performing sensitive-information masking processing on the virtual scene, and synchronizing the masked virtual scene to the second human-computer interaction interface.
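For illustration only, the sensitive-information masking of claim 17 can be sketched as redacting flagged fields of the scene data before synchronization; the set of sensitive keys here is an assumed example:

```python
# Hypothetical sketch of claim 17's sensitive-information masking:
# redact fields flagged as sensitive before synchronizing scene data
# to the second human-computer interaction interface.

SENSITIVE_KEYS = {"chat_log", "real_name", "payment_info"}  # assumed examples

def mask_scene(scene: dict) -> dict:
    # Replace sensitive values with a redaction marker; leave the rest intact.
    return {k: ("***" if k in SENSITIVE_KEYS else v) for k, v in scene.items()}

scene = {"map": "arena", "real_name": "Alice", "hp": 40}
print(mask_scene(scene))  # -> {'map': 'arena', 'real_name': '***', 'hp': 40}
```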
18. A method for interaction in a virtual scene, the method comprising:
presenting, in a second human-computer interaction interface corresponding to a second account, a virtual scene synchronized from a first human-computer interaction interface, wherein the virtual scene is logged into based on a first account, and presenting the virtual scene in the second human-computer interaction interface comprises: in response to a synchronization region selection operation for the virtual scene, presenting the selected synchronization region in the second human-computer interaction interface; or receiving a synchronization region parameter sent by the first account, and presenting, in the second human-computer interaction interface, a synchronization region in the virtual scene corresponding to the synchronization region parameter;
receiving consultation information sent by the first account, wherein the consultation information is obtained by performing semantic extraction processing on image data of the virtual scene and adding a structured sentence to the extracted semantic information;
receiving, through the second human-computer interaction interface, interaction information for the virtual scene;
and sending the interaction information to the first account, so that the first account presents the interaction information in the first human-computer interaction interface, wherein the interaction information is used for guiding a control operation of the first account in the virtual scene.
19. An interaction apparatus for a virtual scene, the apparatus comprising:
a scene presentation module, configured to present a virtual scene in a first human-computer interaction interface, wherein the virtual scene is logged into based on a first account;
a scene synchronization module, configured to synchronize the virtual scene to a second human-computer interaction interface corresponding to a second account in response to a synchronization trigger operation for the virtual scene, wherein synchronizing the virtual scene to the second human-computer interaction interface comprises: in response to a synchronization region selection operation for the virtual scene, synchronizing the selected synchronization region to the second human-computer interaction interface; or receiving a synchronization region parameter sent by the second account, and synchronizing a synchronization region in the virtual scene corresponding to the synchronization region parameter to the second human-computer interaction interface; performing semantic extraction processing on image data of the virtual scene to obtain semantic information in the virtual scene; and adding a structured sentence to the semantic information to obtain consultation information, and sending the consultation information to the second account;
an information presentation module, configured to receive interaction information sent by the second account based on the second human-computer interaction interface, and present the interaction information in the first human-computer interaction interface, wherein the interaction information is used for guiding a control operation of the first account in the virtual scene.
20. An interaction apparatus for a virtual scene, the apparatus comprising:
a synchronization presentation module, configured to present, in a second human-computer interaction interface corresponding to a second account, a virtual scene synchronized from a first human-computer interaction interface, wherein the virtual scene is logged into based on a first account, and presenting the virtual scene in the second human-computer interaction interface comprises: in response to a synchronization region selection operation for the virtual scene, presenting the selected synchronization region in the second human-computer interaction interface; or receiving a synchronization region parameter sent by the first account, and presenting, in the second human-computer interaction interface, a synchronization region in the virtual scene corresponding to the synchronization region parameter;
a receiving module, configured to receive consultation information sent by the first account, wherein the consultation information is obtained by performing semantic extraction processing on image data of the virtual scene and adding a structured sentence to the extracted semantic information; and receive, through the second human-computer interaction interface, interaction information for the virtual scene;
a sending module, configured to send the interaction information to the first account, so that the first account presents the interaction information in the first human-computer interaction interface, wherein the interaction information is used for guiding a control operation of the first account in the virtual scene.
21. An electronic device, comprising:
a memory, configured to store executable instructions;
a processor, configured to implement, when executing the executable instructions stored in the memory, the method for interaction in a virtual scene according to any one of claims 1 to 17 or the method for interaction in a virtual scene according to claim 18.
22. A computer-readable storage medium storing executable instructions which, when executed by a processor, implement the method for interaction in a virtual scene according to any one of claims 1 to 17 or the method for interaction in a virtual scene according to claim 18.
CN202111141959.3A 2021-09-28 2021-09-28 Virtual scene interaction method and device and electronic equipment Active CN113769395B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111141959.3A CN113769395B (en) 2021-09-28 2021-09-28 Virtual scene interaction method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN113769395A CN113769395A (en) 2021-12-10
CN113769395B true CN113769395B (en) 2023-11-14

Family

ID=78854084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111141959.3A Active CN113769395B (en) 2021-09-28 2021-09-28 Virtual scene interaction method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113769395B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117654061A (en) * 2022-08-23 2024-03-08 腾讯科技(深圳)有限公司 Object control method, device, electronic apparatus, storage medium, and program product

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009149112A1 (en) * 2008-06-03 2009-12-10 Tweedletech, Llc An intelligent game system for putting intelligence into board and tabletop games including miniatures
CN102868928A (en) * 2011-07-05 2013-01-09 腾讯科技(深圳)有限公司 Video image display method and device implementing closed caption
CN107930129A (en) * 2017-11-30 2018-04-20 网易(杭州)网络有限公司 Communication means, medium, device and computing device based on virtual scene
CN108040004A (en) * 2018-01-29 2018-05-15 上海壹账通金融科技有限公司 Control method, device, equipment and the readable storage medium storing program for executing of virtual robot
CN109495711A (en) * 2018-12-29 2019-03-19 南京维沃软件技术有限公司 Video calling processing method sends terminal, receives terminal and electronic equipment
CN110071910A (en) * 2019-03-15 2019-07-30 平安普惠企业管理有限公司 Equipment is synchronous to assist control method, device, computer equipment and storage medium
EP3520868A1 (en) * 2018-02-06 2019-08-07 Gree, Inc. Game processing system, method of processing game, and program for processing game
CN111672132A (en) * 2020-06-03 2020-09-18 西安万像电子科技有限公司 Game control method, game control device, server, and storage medium
CN112241450A (en) * 2020-03-23 2021-01-19 北京来也网络科技有限公司 Question and answer sentence processing method, device, equipment and storage medium combining RPA and AI
CN112402963A (en) * 2020-11-20 2021-02-26 腾讯科技(深圳)有限公司 Information sending method, device, equipment and storage medium in virtual scene


Also Published As

Publication number Publication date
CN113769395A (en) 2021-12-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant