CN117785360A - Interaction method, device, equipment and storage medium - Google Patents


Info

Publication number
CN117785360A
Authority
CN
China
Prior art keywords
target
interface
virtual object
target object
interactions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311801558.5A
Other languages
Chinese (zh)
Inventor
周子君
张恬甜
池承
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202311801558.5A priority Critical patent/CN117785360A/en
Publication of CN117785360A publication Critical patent/CN117785360A/en
Pending legal-status Critical Current


Abstract

Embodiments of the present disclosure relate to interaction methods, apparatuses, devices, and storage media. The method proposed herein comprises: presenting a target interface associated with a target object; in response to determining that the target object is configured with a corresponding virtual object, providing an access portal associated with the virtual object in the target interface; and, based on a selection of the access portal, presenting an interactive interface with the virtual object, the interactive interface supporting multi-modal interactions between the current user and the virtual object. In this way, embodiments of the present disclosure allow a user to access an interactive interface with a corresponding virtual object through an access portal provided in an interface related to the target object.

Description

Interaction method, device, equipment and storage medium
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers and, more particularly, to interaction methods, apparatuses, devices, and computer-readable storage media.
Background
With the development of computer technology, the Internet has become an important platform for people to exchange information. For example, people may share various works through a platform and may also obtain works shared by other users. How to improve the efficiency of users' information interactions on the Internet is therefore an important concern.
Disclosure of Invention
In a first aspect of the present disclosure, an interaction method is provided. The method comprises: presenting a target interface associated with a target object; in response to determining that the target object is configured with a corresponding virtual object, providing an access portal associated with the virtual object in the target interface; and, based on a selection of the access portal, presenting an interactive interface with the virtual object, the interactive interface supporting multi-modal interactions between the current user and the virtual object.
In a second aspect of the present disclosure, an interaction device is provided. The device comprises: a first presentation module configured to present a target interface associated with a target object; an entry providing module configured to provide an access portal associated with the virtual object in the target interface in response to determining that the target object is configured with a corresponding virtual object; and a second presentation module configured to present an interactive interface with the virtual object based on a selection of the access portal, the interactive interface supporting multi-modal interactions between the current user and the virtual object.
In a third aspect of the present disclosure, an electronic device is provided. The device comprises at least one processing unit, and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by the at least one processing unit, cause the device to perform the method of the first aspect.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided. The computer readable storage medium has stored thereon a computer program executable by a processor to implement the method of the first aspect.
It should be understood that the content described in this section is not intended to identify key or essential features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, wherein like or similar reference numerals denote like or similar elements, in which:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments in accordance with the present disclosure may be implemented;
FIGS. 2A-2F illustrate example interfaces according to some embodiments of the present disclosure;
FIG. 3 illustrates a flow chart of an example interaction process, according to some embodiments of the present disclosure;
FIG. 4 illustrates a schematic block diagram of an interaction device according to some embodiments of the present disclosure; and
fig. 5 illustrates a block diagram of an electronic device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that any section/subsection headings provided herein are not limiting. Various embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection. Furthermore, the embodiments described in any section/subsection may be combined in any manner with any other embodiment described in the same section/subsection and/or in a different section/subsection.
In describing embodiments of the present disclosure, the term "comprising" and similar terms should be understood as open-ended, i.e., "including, but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". The terms "first," "second," and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
Embodiments of the present disclosure may relate to user data, the acquisition and/or use of data, and the like. These aspects all follow corresponding legal and related regulations. In embodiments of the present disclosure, all data collection, acquisition, processing, forwarding, use, etc. is performed with knowledge and confirmation by the user. Accordingly, in implementing the embodiments of the present disclosure, the user should be informed of the type of data or information, the range of use, the use scenario, etc. that may be involved and obtain the authorization of the user in an appropriate manner according to the relevant laws and regulations. The particular manner of notification and/or authorization may vary depending on the actual situation and application scenario, and the scope of the present disclosure is not limited in this respect.
In the present description and embodiments, where personal information is processed, the processing is performed on the premise of a valid legal basis (for example, the consent of the personal-information subject has been obtained, or the processing is necessary for the performance of a contract), and only within the prescribed or agreed scope. Where a user refuses the processing of personal information other than that necessary for a basic function, the user's use of that basic function is not affected.
The Internet provides people with a wide variety of ways to interact. For example, viewers may interact with a streamer in a live room, and users may also interact with one another through private messages. However, users may receive a large number of messages, and how to ensure that these messages are handled efficiently and in a timely manner is a key concern.
Embodiments of the present disclosure provide an interaction scheme. According to this scheme, a target interface associated with a target object may be presented; in response to determining that the target object is configured with a corresponding virtual object, an access portal associated with the virtual object is provided in the target interface; and, based on a selection of the access portal, an interactive interface with the virtual object is presented, the interactive interface supporting multi-modal interactions between the current user and the virtual object.
In this way, embodiments of the present disclosure allow a user to enter the interaction space of a corresponding virtual object through an access portal provided in an interface related to the target object, and to perform multi-modal interactions with the virtual object through that space, thereby improving the efficiency of information acquisition.
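The three steps of this scheme can be sketched in code as follows. This is a minimal illustrative sketch, not the patent's implementation; all names (`TargetObject`, `build_target_interface`, `select_access_portal`) are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetObject:
    """Hypothetical model of a target object (e.g., a user) that may be
    configured with a corresponding virtual object."""
    name: str
    virtual_object: Optional[str] = None

def build_target_interface(target: TargetObject) -> dict:
    # Step 1: present a target interface associated with the target object.
    interface = {"title": f"{target.name}'s page", "access_portal": None}
    # Step 2: provide an access portal only if the target object is
    # configured with a corresponding virtual object.
    if target.virtual_object is not None:
        interface["access_portal"] = {"virtual_object": target.virtual_object}
    return interface

def select_access_portal(interface: dict) -> Optional[dict]:
    # Step 3: on selection of the portal, present the interactive interface
    # supporting multi-modal interactions with the virtual object.
    portal = interface["access_portal"]
    if portal is None:
        return None
    return {
        "virtual_object": portal["virtual_object"],
        "modalities": ["text", "voice", "video", "picture"],
    }
```

A target object without a configured virtual object simply yields an interface with no portal, so nothing can be selected.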
Various example implementations of the scheme are described in further detail below in conjunction with the accompanying drawings.
Example Environment
FIG. 1 illustrates a schematic diagram of an example environment 100 in which embodiments of the present disclosure may be implemented. As shown in fig. 1, an example environment 100 may include an electronic device 110.
In this example environment 100, an electronic device 110 may be running an application 120 that supports interface interactions. The application 120 may be any suitable type of application for interface interaction, examples of which may include, but are not limited to: video applications, social applications, or other suitable applications. The user 140 may interact with the application 120 via the electronic device 110 and/or its attached device.
In the environment 100 of fig. 1, if the application 120 is in an active state, the electronic device 110 may present, through the application 120, an interface 150 supporting interface interactions.
In some embodiments, the electronic device 110 communicates with the server 130 to enable provisioning of services for the application 120. The electronic device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, palmtop computer, portable gaming terminal, VR/AR device, personal communication system (Personal Communication System, PCS) device, personal navigation device, personal digital assistant (Personal Digital Assistant, PDA), audio/video player, digital camera/video camera, positioning device, television receiver, radio broadcast receiver, electronic book device, gaming device, or any combination of the preceding, including accessories and peripherals for these devices, or any combination thereof. In some embodiments, electronic device 110 is also capable of supporting any type of interface to the user (such as "wearable" circuitry, etc.).
The server 130 may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks, big data, and artificial-intelligence platforms. Server 130 may include, for example, a computing system/server such as a mainframe, an edge computing node, or a computing device in a cloud environment. The server 130 may provide background services for the application 120 in the electronic device 110 that supports virtual scenes.
A communication connection may be established between server 130 and electronic device 110. The communication connection may be established by wired means or wireless means. The communication connection may include, but is not limited to, a bluetooth connection, a mobile network connection, a universal serial bus (Universal Serial Bus, USB) connection, a wireless fidelity (Wireless Fidelity, wiFi) connection, etc., as embodiments of the disclosure are not limited in this respect. In embodiments of the present disclosure, the server 130 and the electronic device 110 may implement signaling interactions through a communication connection therebetween.
It should be understood that the structure and function of the various elements in environment 100 are described for illustrative purposes only and are not meant to suggest any limitation as to the scope of the disclosure.
Some example embodiments of the present disclosure will be described below with continued reference to the accompanying drawings.
Example interactions
An example interaction procedure according to an embodiment of the present disclosure will be described below in conjunction with fig. 2A through 2F. Fig. 2A-2F illustrate example interfaces 200A-200F, according to some embodiments of the present disclosure. The interfaces 200A-200F may be provided by the electronic device 110 as shown in fig. 1.
As shown in fig. 2A, interface 200A may be an interface associated with a target object (e.g., user X). For example, the interface 200A may be a personal home page of the target object.
In some embodiments, it may be determined whether the target object is configured with a corresponding virtual object. As an example, the target object may enable or disable the corresponding virtual object, for example, through a configuration operation. In some scenarios, such a virtual object may also be referred to as, for example, an "alter ego" of the user.
As shown in fig. 2A, if it is determined that the target object is configured with a corresponding virtual object, the electronic device 110 may provide an access portal associated with the virtual object in the interface 200A.
In some embodiments, the electronic device 110 may provide the indication element 205 corresponding to the access portal in association with identification information (e.g., an image identifier or text identifier) of the target object. For example, the electronic device 110 may display, around the avatar of the target object, an indication element 205 that indicates that the target object is configured with a virtual object; the interactive interface with the virtual object can then be entered through the indication element 205.
As an example, upon receiving a selection of the indication element 205, the electronic device 110 may display an interface 200B as shown in fig. 2B. Interface 200B may correspond to an interactive interface of a current user with a virtual object.
In some embodiments, interface 200B may support multi-modal interactions between the current user and the virtual object. Such multi-modal interactions may include multiple of the following: text interactions, voice interactions, video interactions, and picture interactions.
As an example, a user may input text content, audio content, video content, picture content, etc., for example, through input controls provided in interface 200B, and may receive reply content from a virtual object. Such reply content may also include text content, audio content, video content, picture content, link content, and so forth.
In some embodiments, the interface 200B may provide an interaction container 215 (also referred to as an interaction space) for supporting multi-modal interactions between a current user and virtual objects. In some embodiments, the participants of the interaction container may include only current users and virtual objects. In this way, embodiments of the present disclosure may provide a current user with a one-to-one space for multimodal interactions with respect to virtual objects.
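The one-to-one interaction container described above might be modeled as follows. This is an illustrative sketch under assumed names (`InteractionContainer`, `send`), not the disclosed implementation.

```python
class InteractionContainer:
    """Interaction space whose only participants are the current user and
    the virtual object (one-to-one), supporting multi-modal messages."""

    MODALITIES = {"text", "voice", "video", "picture"}

    def __init__(self, current_user: str, virtual_object: str):
        # Participants of the container include only the current user
        # and the virtual object.
        self.participants = (current_user, virtual_object)
        self.messages = []

    def send(self, sender: str, modality: str, content: str) -> None:
        if sender not in self.participants:
            raise ValueError("only the current user and the virtual object may participate")
        if modality not in self.MODALITIES:
            raise ValueError(f"unsupported modality: {modality}")
        self.messages.append(
            {"sender": sender, "modality": modality, "content": content}
        )
```

Restricting `participants` to exactly two members is what makes the space one-to-one: any third party attempting to post is rejected.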
In some embodiments, the electronic device 110 may also receive a preset operation from the user in the interface 200B, for example, and may switch to displaying interface 200C. Interface 200B is also referred to as the first interactive interface, and interface 200C as the second interactive interface.
As shown in fig. 2C, interface 200C may be a second interactive interface with a second virtual object, where the second virtual object corresponds to another object (e.g., user Y) that is different from the target object. For example, the electronic device 110 may switch to displaying an interactive interface with other virtual objects based on a sliding operation of the user in a predetermined direction received in the interface 200B.
In this manner, embodiments of the present disclosure make it easier for a user to switch between interactions with different virtual objects, thereby improving interaction efficiency.
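The sliding-based switch between interactive interfaces could be sketched as a cursor over an ordered list of virtual objects. All names are hypothetical, and the mapping of swipe direction to neighbor is an assumption for illustration.

```python
def switch_virtual_object(current: str, ordered_objects: list, direction: str) -> str:
    """Return the virtual object whose interactive interface should be shown
    after a swipe in the given predetermined direction (illustrative sketch)."""
    i = ordered_objects.index(current)
    # Assumed convention: swipe "up" advances, any other direction goes back.
    step = 1 if direction == "up" else -1
    # Wrap around so a swipe is always meaningful.
    return ordered_objects[(i + step) % len(ordered_objects)]
```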
In some embodiments, electronic device 110 may also provide the access portal associated with the virtual object in other suitable manners. With continued reference to fig. 2A, where the interface 200A is the personal home page of the target object, the electronic device 110 may also present visual elements associated with the virtual object in a background area 210 of the personal home page, such that the background area corresponds to the access portal.
As an example, as shown in fig. 2A, the electronic device 110 may display visual images of virtual objects and/or corresponding text information (e.g., personal introduction) and the like in the background area 210. The electronic device 110 may present an interface 200B as shown in fig. 2B, for example, based on a user's trigger for the background area 210 to support multi-modal interactions between the current user and the virtual object.
In some embodiments, the electronic device 110 may also provide access portals in other interfaces associated with the target object.
Fig. 2D illustrates an example interface 200D according to some embodiments of the present disclosure. The interface 200D may correspond, for example, to a viewing interface of the work 235 of the target object (e.g., user X).
Similar to fig. 2A, the electronic device 110 may present the indication element 240, for example, in association with identification information (e.g., an avatar) of the target object. The indication element 240 may indicate that the target object is configured with a corresponding virtual object. Further, the user may access the interactive interface 200B with the virtual object by, for example, clicking on the indication element 240.
In some embodiments, the electronic device 110 may use an indication element associated with the identification information to indicate, for example, whether the target object is in a live state. As an example, in response to the target object being in a live state, the electronic device 110 may modify the indication element associated with the virtual object (also referred to as the first indication element, e.g., indication element 205 or indication element 240) into a second indication element. Unlike the first indication element, the second indication element corresponds, for example, to a second access portal leading to the live interface of the target object.
Taking the display element around the avatar of the target object as an example: if the target object is in a live state, the electronic device 110 may present the display element in a first style. Conversely, if the target object is not in a live state but is configured with a corresponding virtual object, the electronic device 110 may present the display element in a second style.
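The style selection above amounts to a small decision function. The sketch below is illustrative; the style names are assumptions, not values from the patent.

```python
from typing import Optional

def indication_style(is_live: bool, has_virtual_object: bool) -> Optional[str]:
    """Choose the style of the display element around the target object's
    avatar (illustrative sketch; style names are assumed)."""
    if is_live:
        # Element acts as a portal to the target object's live interface.
        return "live_style"
    if has_virtual_object:
        # Element acts as a portal to the virtual-object interactive interface.
        return "virtual_object_style"
    return None  # no access portal is shown around the avatar
```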
In some embodiments, the indication element 205 or the indication element 240 may have a dynamic display style. For example, the indication element 205 or the indication element 240 may have a breathing-light effect.
In some embodiments, the electronic device 110 may also present an indication element 245 corresponding to the access portal in association with a textual identification (e.g., a user name) of the target object.
Fig. 2E illustrates an example interface 200E according to some embodiments of the present disclosure. As shown in fig. 2E, interface 200E may correspond to, for example, a viewing interface of work 250 of a target object (e.g., user X).
In some embodiments, as shown in fig. 2E, the electronic device 110 may also present, for example, an indication element 255 corresponding to the access portal in the work playing area of the interface 200E. As yet another example, the electronic device 110 may also display an indication element (not shown in the figure) corresponding to the access portal in association with identification information (e.g., the work title) of the work corresponding to the interface (e.g., interface 200E).
In some embodiments, as shown in fig. 2F, the electronic device 110 may also display an interface 200F associated with the target object. Interface 200F may correspond to a viewing interface for media content 260, which may correspond, for example, to preview content of the virtual object corresponding to the target object (e.g., user X).
For example, electronic device 110 may display visual images of virtual objects, text information, audio information, and the like, for example, in interface 200F. Further, the electronic device 110 may provide an indication element 265 corresponding to the access portal in the interface 200F.
In some embodiments, the target interface providing the access portal may be an interface for presenting target media content in a media content stream. As an example, the media content played in interfaces 200D, 200E, and 200F may come from a media content stream.
In some scenarios, the media content stream may also be referred to as a Feed stream, for example. As an example, the media content stream may include a plurality of media content recommended for the current user. Furthermore, the media content presented by the media content stream can also change accordingly based on the switching operation.
For example, interface 200D, interface 200E, and/or interface 200F may also switch to rendering another media content in the media content stream based on a switching operation (e.g., a swipe-up operation or a swipe-down operation) for the media content stream.
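The media content stream (Feed) behavior described above can be sketched as a simple cursor over recommended items. The class and method names, and the up/down direction mapping, are assumptions for illustration.

```python
class MediaFeed:
    """Stream of recommended media content; a switching operation (e.g., a
    swipe up or down) moves presentation to another item (illustrative)."""

    def __init__(self, items):
        self.items = list(items)
        self.index = 0

    @property
    def current(self):
        return self.items[self.index]

    def switch(self, direction: str):
        if direction == "up" and self.index < len(self.items) - 1:
            self.index += 1   # swipe up: present the next recommended content
        elif direction == "down" and self.index > 0:
            self.index -= 1   # swipe down: return to the previous content
        return self.current
```

Any item in the stream (a media work, a live preview, or a virtual-object preview) can carry its own access portal when its author is configured with a virtual object.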
In some embodiments, the target media content provided in the media content stream may include a media work associated with the target object. For example, the interface 200D may correspond to a media work published by the target object. As another example, media works associated with the target object may also include, for example, media works that the target object has favorited or liked.
In some embodiments, the target media content provided in the media content stream may include preview content associated with a live event of the target object. For example, the media content stream may present preview content of a live room of the target object. Accordingly, the electronic device 110 may also provide access to virtual objects in association with the preview content.
In some embodiments, the target media content provided in the media content stream may include preview content associated with the virtual object. For example, as discussed with reference to fig. 2F, media content 260 may provide preview content associated with a virtual object, examples of which may include, but are not limited to: visual images of virtual objects, text information, audio information, and the like.
Further, it should be understood that the virtual object may support multi-modal interactions with the current user based on any suitable technique, and the present disclosure is not intended to be limited to the driving technique of the virtual object.
Based on the interaction process discussed above, embodiments of the present disclosure can provide an access portal to a virtual object in an interface associated with a target object, depending on whether the target object is configured with the corresponding virtual object. In this way, embodiments of the present disclosure make it easier for other users to perceive that the target object is configured with a virtual object, and to initiate multi-modal interactions with the virtual object more conveniently.
Example procedure
Fig. 3 illustrates a flow chart of an example interaction process 300, according to some embodiments of the present disclosure. The process 300 may be implemented at the electronic device 110. The process 300 is described below with reference to fig. 1.
As shown in fig. 3, at block 310, the electronic device 110 presents a target interface associated with a target object.
In block 320, in response to determining that the target object is configured with a corresponding virtual object, the electronic device 110 provides an access portal associated with the virtual object in the target interface.
At block 330, the electronic device 110 presents an interactive interface with the virtual object based on the selection of the access portal, the interactive interface for supporting multi-modal interactions between the current user and the virtual object.
In some embodiments, the target interface displays first identification information of the target object, and providing the access portal associated with the virtual object in the target interface includes: presenting, in association with the first identification information of the target object, a first indication element corresponding to the access portal.
In some embodiments, the access portal is a first access portal, and the process 300 further comprises: and in response to the target object being in a live state, modifying the first indication element to a second indication element, the second indication element corresponding to a second access portal of the live interface of the target object.
In some embodiments, the first indication element has a dynamic display style.
In some embodiments, the target interface is a viewing interface of a target work associated with the target object, and providing the access portal associated with the virtual object in the target interface includes: presenting a second indication element corresponding to the access portal in a work playing area of the viewing interface; and/or presenting, in association with second identification information of the target work, a third indication element corresponding to the access portal.
In some embodiments, the target interface comprises a personal home page of the target object, and providing access portals associated with the virtual objects in the target interface comprises: visual elements associated with the virtual object are presented in a background area of the personal home page such that the background area corresponds to the access portal.
In some embodiments, the target interface is for rendering target media content in a media content stream, the media content stream comprising a plurality of media content, the target interface further configured to switch to rendering another media content in the media content stream based on a switching operation for the media content stream.
In some embodiments, the target media content includes one of: a media work associated with the target object; first preview content associated with a live event of a target object; and second preview content associated with the virtual object.
In some embodiments, the multimodal interactions include a plurality of interactions among: text interactions, voice interactions, video interactions, picture interactions.
In some embodiments, the interactive interface is a first interactive interface with a first virtual object, the process 300 further comprising: based on the preset operation received in the first interactive interface, a second interactive interface with a second virtual object is presented, the second virtual object corresponding to another object different from the target object.
In some embodiments, the interactive interface is configured to provide an interactive container for supporting multimodal interactions between a current user and virtual objects, the participants of the interactive container being the current user and the virtual objects.
Example apparatus and apparatus
Embodiments of the present disclosure also provide corresponding apparatus for implementing the above-described methods or processes. Fig. 4 illustrates a schematic block diagram of an interaction device 400, according to some embodiments of the present disclosure. The apparatus 400 may be implemented as or included in the electronic device 110. The various modules/components in apparatus 400 may be implemented in hardware, software, firmware, or any combination thereof.
As shown in fig. 4, the apparatus 400 includes: a first presentation module 410 configured to present a target interface associated with a target object; an entry providing module 420 configured to provide an access portal associated with the virtual object in the target interface in response to determining that the target object is configured with a corresponding virtual object; and a second presentation module 430 configured to present an interactive interface with the virtual object based on a selection of the access portal, the interactive interface supporting multi-modal interactions between the current user and the virtual object.
In some embodiments, the target interface displays first identification information of the target object, and the portal providing module 420 is further configured to: present, in association with the first identification information of the target object, a first indication element corresponding to the access portal.
In some embodiments, the access portal is a first access portal, and the apparatus 400 further comprises an element modification module configured to: and in response to the target object being in a live state, modifying the first indication element to a second indication element, the second indication element corresponding to a second access portal of the live interface of the target object.
In some embodiments, the first indication element has a dynamic display style.
In some embodiments, the target interface is a viewing interface of a target work associated with the target object, and the entry providing module 420 is further configured to: present a second indication element corresponding to the access portal in a work playing area of the viewing interface; and/or present, in association with second identification information of the target work, a third indication element corresponding to the access portal.
In some embodiments, the target interface includes a personal home page of the target object, and the entry providing module 420 is further configured to: present a visual element associated with the virtual object in a background area of the personal home page, such that the background area corresponds to the access portal.
In some embodiments, the target interface is for presenting target media content in a media content stream, the media content stream comprising a plurality of items of media content, and the target interface is further configured to switch to presenting another media content in the media content stream based on a switching operation for the media content stream.
In some embodiments, the target media content includes one of: a media work associated with the target object; first preview content associated with a live event of a target object; and second preview content associated with the virtual object.
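The media-content-stream behaviour in the two embodiments above can be sketched as a circular list over which a switching operation advances. This is a hedged illustration: the `MediaContentStream` name, the `kind` field, and the wrap-around policy are assumptions, not details of the disclosure.

```python
class MediaContentStream:
    """A stream of media content items through which the target
    interface pages via a switching operation."""

    def __init__(self, items):
        self._items = list(items)
        self._index = 0

    @property
    def current(self):
        return self._items[self._index]

    def switch(self, step=1):
        """Switching operation: present another media content in the stream."""
        self._index = (self._index + step) % len(self._items)
        return self.current

# The three content kinds named in the embodiment above, as stream items.
stream = MediaContentStream([
    {"kind": "media_work", "owner": "target_object"},
    {"kind": "live_preview", "owner": "target_object"},
    {"kind": "virtual_object_preview"},
])
```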
In some embodiments, the multi-modal interactions include a plurality of interactions among: text interaction, voice interaction, video interaction, and picture interaction.
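One natural way to realize the multi-modal support above is to route each modality to its own handler. The sketch below assumes this dispatch-table design; the handler names and payload formats are illustrative, not taken from the disclosure.

```python
# Per-modality handlers; payloads are assumed to be text for the text
# modality and raw bytes for the media modalities.
def handle_text(payload):    return f"text:{payload}"
def handle_voice(payload):   return f"voice:{len(payload)} bytes"
def handle_video(payload):   return f"video:{len(payload)} bytes"
def handle_picture(payload): return f"picture:{len(payload)} bytes"

MODALITY_HANDLERS = {
    "text": handle_text,
    "voice": handle_voice,
    "video": handle_video,
    "picture": handle_picture,
}

def dispatch(modality, payload):
    """Route an interaction to the handler for its modality."""
    handler = MODALITY_HANDLERS.get(modality)
    if handler is None:
        raise ValueError(f"unsupported modality: {modality}")
    return handler(payload)
```

Adding a new modality under this design only requires registering another handler in the table.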
In some embodiments, the interactive interface is a first interactive interface with a first virtual object, and the apparatus 400 further comprises an interface switching module configured to: present, based on a preset operation received in the first interactive interface, a second interactive interface with a second virtual object, the second virtual object corresponding to another object different from the target object.
In some embodiments, the interactive interface is configured to provide an interactive container for supporting multimodal interactions between a current user and virtual objects, the participants of the interactive container being the current user and the virtual objects.
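The "interactive container" notion above can be sketched as a session whose participants are exactly the current user and the virtual object, and through which multi-modal messages flow. All names and field choices below are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class InteractiveContainer:
    """Session limited to two participants: the current user and the
    virtual object configured for the target object."""
    user_id: str
    virtual_object_id: str
    messages: list = field(default_factory=list)

    @property
    def participants(self):
        return {self.user_id, self.virtual_object_id}

    def post(self, sender, modality, payload):
        """Append a multi-modal message; reject any third-party sender."""
        if sender not in self.participants:
            raise PermissionError("only the user and the virtual object may post")
        self.messages.append(
            {"from": sender, "modality": modality, "payload": payload}
        )
```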
Fig. 5 illustrates a block diagram of an electronic device 500 in which one or more embodiments of the disclosure may be implemented. It should be understood that the electronic device 500 shown in fig. 5 is merely exemplary and should not be construed as limiting the functionality and scope of the embodiments described herein. The electronic device 500 shown in fig. 5 may be used to implement the electronic device 110 of fig. 1.
As shown in fig. 5, the electronic device 500 is in the form of a general-purpose electronic device. The components of electronic device 500 may include, but are not limited to, one or more processors or processing units 510, memory 520, storage 530, one or more communication units 540, one or more input devices 550, and one or more output devices 560. The processing unit 510 may be a real or virtual processor and is capable of performing various processes according to programs stored in the memory 520. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to improve the parallel processing capabilities of electronic device 500.
Electronic device 500 typically includes multiple computer storage media. Such media may be any available media accessible to electronic device 500, including, but not limited to, volatile and non-volatile media, and removable and non-removable media. The memory 520 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof. The storage device 530 may be removable or non-removable media, and may include machine-readable media such as flash drives, magnetic disks, or any other media capable of storing information and/or data and accessible within electronic device 500.
The electronic device 500 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in fig. 5, a magnetic disk drive for reading from or writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. Memory 520 may include a computer program product 525 having one or more program modules configured to perform the various methods or acts of the various embodiments of the present disclosure.
The communication unit 540 enables communication with other electronic devices through a communication medium. Additionally, the functionality of the components of electronic device 500 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communication connection. Thus, the electronic device 500 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 550 may be one or more input devices, such as a mouse, keyboard, or trackball. The output device 560 may be one or more output devices, such as a display, speakers, or printer. Via the communication unit 540, the electronic device 500 may also communicate, as desired, with one or more external devices (not shown) such as storage devices and display devices, with one or more devices that enable a user to interact with the electronic device 500, or with any device (e.g., a network card or modem) that enables the electronic device 500 to communicate with one or more other electronic devices. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, there is provided a computer-readable storage medium having computer-executable instructions stored thereon, wherein the computer-executable instructions are executed by a processor to implement the method described above. According to an exemplary implementation of the present disclosure, there is also provided a computer program product tangibly stored on a non-transitory computer-readable medium and comprising computer-executable instructions that are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus, devices, and computer program products implemented according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.

Claims (14)

1. An interaction method, comprising:
presenting a target interface associated with the target object;
providing, in response to determining that the target object is configured with a corresponding virtual object, an access portal associated with the virtual object in the target interface; and
presenting, based on a selection of the access portal, an interactive interface with the virtual object, the interactive interface being configured to support multi-modal interactions between a current user and the virtual object.
2. The method of claim 1, wherein the target interface displays first identification information of the target object, and providing an access portal associated with the virtual object in the target interface comprises:
presenting, in association with the first identification information of the target object, a first indication element corresponding to the access portal.
3. The method of claim 2, wherein the access portal is a first access portal, the method further comprising:
modifying, in response to the target object being in a live state, the first indication element into a second indication element, wherein the second indication element corresponds to a second access portal of a live interface of the target object.
4. The method of claim 2, wherein the first indication element has a dynamic display style.
5. The method of claim 1, wherein the target interface is a viewing interface of a target work associated with the target object, and providing an access portal associated with the virtual object in the target interface comprises:
presenting a second indication element corresponding to the access portal in a work playing area of the viewing interface; and/or
presenting, in association with second identification information of the target work, a third indication element corresponding to the access portal.
6. The method of claim 1, wherein the target interface comprises a personal home page of the target object, and providing an access portal associated with the virtual object in the target interface comprises:
presenting a visual element associated with the virtual object in a background area of the personal home page, such that the background area corresponds to the access portal.
7. The method of claim 1, wherein the target interface is for presenting target media content in a media content stream, the media content stream comprising a plurality of items of media content, the target interface further configured to switch to presenting another media content in the media content stream based on a switching operation for the media content stream.
8. The method of claim 7, wherein the target media content comprises one of:
a media work associated with the target object;
first preview content associated with a live event of the target object; and
second preview content associated with the virtual object.
9. The method of claim 1, wherein the multi-modal interactions include a plurality of interactions among: text interaction, voice interaction, video interaction, and picture interaction.
10. The method of claim 1, wherein the interactive interface is a first interactive interface with a first virtual object, the method further comprising:
presenting, based on a preset operation received in the first interactive interface, a second interactive interface with a second virtual object, wherein the second virtual object corresponds to another object different from the target object.
11. The method of claim 1, wherein the interactive interface is configured to provide an interactive container for supporting the multi-modal interactions between the current user and the virtual object, the participants of the interactive container being the current user and the virtual object.
12. An interaction device, comprising:
a first presentation module configured to present a target interface associated with a target object;
an entry providing module configured to provide, in response to determining that the target object is configured with a corresponding virtual object, an access portal associated with the virtual object in the target interface; and
a second presentation module configured to present, based on a selection of the access portal, an interactive interface with the virtual object, the interactive interface being configured to support multi-modal interactions between a current user and the virtual object.
13. An electronic device, comprising:
at least one processing unit; and
at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, which when executed by the at least one processing unit, cause the electronic device to perform the method of any one of claims 1 to 11.
14. A computer readable storage medium having stored thereon a computer program executable by a processor to implement the method of any of claims 1 to 11.
CN202311801558.5A 2023-12-25 2023-12-25 Interaction method, device, equipment and storage medium Pending CN117785360A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311801558.5A CN117785360A (en) 2023-12-25 2023-12-25 Interaction method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117785360A 2024-03-29

Family

ID=90379391




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination