CN113301412A - Information display method, information processing method, device and system - Google Patents
Information display method, information processing method, device and system
- Publication number
- CN113301412A (application number CN202010340430.3A)
- Authority
- CN
- China
- Prior art keywords
- video picture
- area
- product
- server
- live broadcast
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/488—Data services, e.g. news ticker
- H04N21/4882—Data services, e.g. news ticker for displaying messages, e.g. warnings, reminders
Abstract
The embodiments of the present application provide an information display method, an information processing method, a device, and a system. A first area is provided in a live interface of a live broadcast room, and a first video picture is displayed in the first area; the first video picture is captured by an anchor terminal. A second area is provided in the live interface, and a second video picture is displayed in the second area; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object, the third video picture being captured by a first user terminal. The technical solution of the embodiments of the present application improves the product promotion effect.
Description
Technical Field
The embodiments of the present application relate to the field of computer application technology, and in particular to an information display method, an information processing method, a device, and a system.
Background
With the development of Internet and streaming-media technology, webcasting has grown rapidly and attracts more and more viewers. Webcasting refers to producing and distributing information on site, synchronously with the occurrence and development of an event; it is a two-way network distribution mode for information.
Because webcasting brings considerable user traffic, many product providers choose live streaming to introduce their products to users. For example, in an e-commerce scenario, combining live streaming with an e-commerce platform can guide users to learn about goods more fully and quickly, so that purchases can be completed sooner.
However, at present a product is introduced through the anchor's explanation, with an on-site trial when necessary. Users can only learn about the product from the anchor's explanation and cannot actually experience it themselves, so the product promotion effect is not good enough.
Disclosure of Invention
The embodiments of the present application provide an information display method, an information processing method, a device, and a system, so as to solve the technical problem in the prior art that the product promotion effect is not good enough.
In a first aspect, an embodiment of the present application provides an information display method, including:
providing a first area in a live interface of a live broadcast room;
displaying a first video picture in the first area; the first video picture is acquired by an anchor terminal;
providing a second area in the live interface; displaying a second video picture in the second area; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal.
In a second aspect, an embodiment of the present application provides an information display method, including:
a first user end provides a first area and a first interactive control in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation aiming at the first interactive control, establishing an interactive connection channel with a main broadcasting end through a server end, and providing a second area in the live broadcasting interface;
collecting a third video picture and sending the third video picture to the server;
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in the third video picture, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture;
displaying the second video picture in the second area.
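For illustration only, the following Python sketch outlines the first-user-side flow of this aspect. The `FirstUserClient` class, the endpoint paths, and the message fields are assumptions introduced here to make the sequence concrete; they are not part of the disclosure, which does not define any API.

```python
# Illustrative sketch only: the class, endpoints, and fields below are assumptions.
import requests  # assumed HTTP transport; any signaling channel would do

SERVER = "https://example-live-server"  # hypothetical server address


class FirstUserClient:
    def __init__(self, room_id: str, user_id: str):
        self.room_id = room_id
        self.user_id = user_id

    def on_first_interactive_control_triggered(self):
        # Ask the server side to establish an interactive connection channel
        # with the anchor side, then open the second area locally.
        requests.post(f"{SERVER}/interactive/connect",
                      json={"room": self.room_id, "user": self.user_id})
        self.show_second_area()

    def push_third_video_frame(self, frame_bytes: bytes):
        # Upload the locally captured (third) video picture to the server side.
        requests.post(f"{SERVER}/frames/upload", data=frame_bytes,
                      params={"room": self.room_id, "user": self.user_id})

    def on_trial_prompt_triggered(self, target_product_id: str):
        # Send a trial request; the server identifies the target object in the
        # third video picture and fuses the product effect into it.
        requests.post(f"{SERVER}/trial/request",
                      json={"room": self.room_id, "user": self.user_id,
                            "product": target_product_id})

    def show_second_area(self):
        print("second area opened in the live interface")

    def render_second_video_frame(self, fused_frame):
        # Display the fused (second) video picture returned by the server side.
        print("rendering fused frame in the second area")
```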
In a third aspect, an embodiment of the present application provides an information display method, including:
the anchor terminal provides a first area and second interactive controls corresponding to different users in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation for any second interactive control, establishing, through a server side, an interactive connection channel with the first user side corresponding to that second interactive control, and providing a second area in the live broadcast interface;
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in a third video picture acquired by a first user side, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture;
displaying the second video picture in the second area.
In a fourth aspect, an embodiment of the present application provides an information display method, including:
the second user side obtains a first video picture and a second video picture provided by the server side; the first video picture is acquired by an anchor terminal; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; the third video picture is acquired by the first user terminal;
providing a first area and a second area in a live broadcast interface;
displaying the first video picture in the first area and displaying the second video picture in the second area.
In a fifth aspect, an embodiment of the present application provides an information processing method, including:
establishing an interactive connection channel between a first user terminal and a main broadcasting terminal;
acquiring a first video picture acquired by the anchor terminal;
acquiring a third video picture acquired by the first user side, and identifying a target object in the third video picture;
based on the target object, fusing the product effect of the target product associated with the live broadcast room into the third video picture to obtain a second video picture;
and providing the first video picture and the second video picture to a second user end so as to enable the second user end to display the first video picture and the second video picture.
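A minimal, non-authoritative sketch of the server-side processing of this aspect is given below, assuming frames arrive as numpy arrays. `detect_target_object` and `fuse_product_effect` are placeholders standing in for whatever recognition and blending method an implementation chooses; the application does not prescribe a particular algorithm.

```python
# Sketch under stated assumptions; detection/fusion bodies are dummy placeholders.
from typing import Callable, Dict
import numpy as np


def detect_target_object(third_frame: np.ndarray, target_product: str) -> Dict:
    """Placeholder: locate the trial region for the target product
    (e.g. the lips for a lipstick) and return a bounding box."""
    h, w = third_frame.shape[:2]
    return {"bbox": (w // 4, h // 4, w // 2, h // 2)}  # dummy (x, y, width, height)


def fuse_product_effect(third_frame: np.ndarray, region: Dict,
                        effect_image: np.ndarray) -> np.ndarray:
    """Placeholder: blend the product effect image onto the detected region,
    producing the second video picture."""
    x, y, rw, rh = region["bbox"]
    fused = third_frame.copy()
    patch = np.resize(effect_image, (rh, rw, 3))
    blended = 0.5 * fused[y:y + rh, x:x + rw] + 0.5 * patch
    fused[y:y + rh, x:x + rw] = blended.astype(fused.dtype)
    return fused


def handle_interactive_session(first_frame: np.ndarray,
                               third_frame: np.ndarray,
                               target_product: str,
                               effect_image: np.ndarray,
                               send_to_second_user: Callable[[np.ndarray, np.ndarray], None]) -> None:
    # The interactive connection channel is assumed to be already established.
    region = detect_target_object(third_frame, target_product)      # identify target object
    second_frame = fuse_product_effect(third_frame, region, effect_image)  # fuse effect
    send_to_second_user(first_frame, second_frame)                  # provide both pictures
```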
In a sixth aspect, an embodiment of the present application provides an information display method, including:
a first user end provides a first area and a mic-connection (co-streaming) control in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation for the mic-connection control, establishing a mic-connection interaction channel with the anchor end through a server end, and providing a second area in the live broadcast interface;
collecting a third video picture and sending the third video picture to the server;
displaying makeup test prompt information corresponding to at least one makeup product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any makeup trying prompt message, sending a makeup trying request to a server based on a target makeup product corresponding to any makeup trying prompt message so that the server can identify a face feature part corresponding to the target makeup product in the third video picture, and fusing a makeup effect image of the target makeup product to the third video picture based on the face feature part to obtain a makeup trying picture;
displaying the makeup trial picture in the second area.
In a seventh aspect, an embodiment of the present application provides an information display method, including:
the anchor end provides, in a live broadcast interface, a first area and mic-connection controls corresponding to different users, and displays a first video picture in the first area;
responding to the triggering operation for any mic-connection control, establishing, through a server side, a mic-connection interaction channel with the first user side corresponding to that mic-connection control, and providing a second area in the live broadcast interface;
displaying makeup test prompt information corresponding to at least one makeup product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any makeup test prompt message, sending a makeup test request to a server based on a target makeup product corresponding to any makeup test prompt message so that the server can identify a face feature part corresponding to the target makeup product in a third video picture acquired by a first user side, and fusing a makeup effect image of the target makeup product into the third video picture based on the face feature part to obtain a makeup test picture;
displaying the makeup trial picture in the second area.
In an eighth aspect, an embodiment of the present application provides an information display method, including:
the second user side obtains a first video picture and a makeup trial picture provided by the server side; the first video picture is acquired by an anchor terminal; the makeup trial picture is obtained by identifying a face feature part from a third video picture and fusing a makeup effect picture of a target makeup product associated with the live broadcast room into the third video picture based on the face feature part; the third video picture is acquired by the first user terminal;
providing a first area and a second area in a live broadcast interface;
displaying the first video picture in the first area, and displaying the makeup trial picture in the second area.
In a ninth aspect, an embodiment of the present application provides an information processing method, including:
establishing a mic-connection (co-streaming) interaction channel between a first user terminal and an anchor terminal;
acquiring a first video picture acquired by the anchor terminal;
acquiring a third video picture acquired by the first user side, and identifying a face feature part in the third video picture;
based on the face feature part, fusing a makeup effect picture of a target makeup product associated with a live broadcast room into the third video picture to obtain a makeup trial picture;
and providing the first video picture and the makeup trial picture to a second user end so that the second user end can display the first video picture and the makeup trial picture.
In a tenth aspect, an embodiment of the present application provides a live webcast system, which includes a server, an anchor, a first user, and a second user;
the server is used for establishing an interactive connection channel between the anchor terminal and the first user terminal; acquiring a first video picture acquired by the anchor terminal; acquiring a third video picture acquired by the first user side, and identifying a target object in the third video picture; based on the target object, fusing the product effect of the target product associated with the live broadcast room into the third video picture to obtain a second video picture;
the first user side is used for acquiring a third video picture after an interactive connection channel with the anchor side is established, and sending the third video picture to the server side; acquiring the first video picture and the second video picture from the server side, and displaying the first video picture and the second video picture;
the second user side is used for acquiring the first video picture and the second video picture from the server side and displaying the first video picture and the second video picture.
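As a toy illustration of which endpoint produces and consumes which picture in this system, the following in-process simulation replaces the real streaming transport with in-memory queues. It is an assumption-laden sketch of the data flow only, not an implementation of the system.

```python
# Toy simulation: queues stand in for real streaming transport (an assumption).
import queue

server_in_first = queue.Queue()   # first video pictures from the anchor side
server_in_third = queue.Queue()   # third video pictures from the first user side
to_viewers = queue.Queue()        # (first, second) pairs delivered to user sides


def anchor_side():
    server_in_first.put("first_video_frame")      # captured at the anchor's live site


def first_user_side():
    server_in_third.put("third_video_frame")      # captured after the channel is established


def server_side(fuse):
    first = server_in_first.get()
    third = server_in_third.get()
    second = fuse(third)                          # identify target object + fuse product effect
    to_viewers.put((first, second))               # provided to the first and second user sides


def second_user_side():
    first, second = to_viewers.get()
    print("first area:", first, "| second area:", second)


anchor_side()
first_user_side()
server_side(lambda third: f"fused({third})")
second_user_side()
```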
In an eleventh aspect, an embodiment of the present application provides an information display device, including:
the first display module is used for providing a first area in a live interface of a live broadcast room; displaying a first video picture in the first area; the first video picture is acquired by an anchor terminal;
the second display module is used for providing a second area in the live broadcast interface; displaying a second video picture in the second area; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal.
In a twelfth aspect, an embodiment of the present application provides an information display device, including:
the third display module is used for providing a first area and a first interactive control in a live broadcast interface and displaying a first video picture in the first area;
the first response module is used for responding to the triggering operation aiming at the first interactive control, establishing an interactive connection channel with a main broadcast end through a server end, and providing a second area in the live broadcast interface;
the first acquisition module is used for acquiring a third video picture and sending the third video picture to the server;
the fourth display module is used for displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in the third video picture, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture; displaying the second video picture in the second area.
In a thirteenth aspect, an embodiment of the present application provides an information display apparatus, including:
the fifth display module is used for providing a first area and second interactive controls corresponding to different users in a live broadcast interface and displaying a first video picture in the first area;
the second response module is used for, in response to the triggering operation for any second interactive control, establishing, through the server end, an interactive connection channel with the first user end corresponding to that second interactive control, and providing a second area in the live broadcast interface;
the sixth display module is used for displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in a third video picture acquired by a first user side, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture; displaying the second video picture in the second area.
In a fourteenth aspect, an embodiment of the present application provides an information display apparatus, including:
the first acquisition module is used for acquiring a first video picture and a second video picture provided by the server; the first video picture is acquired by an anchor terminal; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; the third video picture is acquired by the first user terminal;
the seventh display module is used for providing a first area and a second area in a live broadcast interface; displaying the first video picture in the first area and displaying the second video picture in the second area.
In a fifteenth aspect, an embodiment of the present application provides an information processing apparatus, including:
the interactive triggering module is used for establishing an interactive connection channel between the first user terminal and the anchor terminal;
the second acquisition module is used for acquiring a first video picture acquired by the anchor terminal; acquiring a third video picture acquired by the first user side;
the identification module is used for identifying a target object in the third video picture;
the fusion module is used for fusing the product effect of the target product associated with the live broadcast room into the third video picture to obtain a second video picture based on the target object;
and the providing module is used for providing the first video picture and the second video picture to a second user end so as to enable the second user end to display the first video picture and the second video picture.
In a sixteenth aspect, an embodiment of the present application provides a live interface, including a first area and a second area;
the first area is used for displaying a first video picture; the first video picture is acquired by an anchor terminal;
the second area is used for displaying a second video picture; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object position; and the third video picture is acquired by the first user terminal.
In a seventeenth aspect, an embodiment of the present application provides an electronic device, including a storage component, a display component, and a processing component; the storage component stores one or more computer program instructions; the one or more computer program instructions are for being invoked and executed by the processing component to implement the information display method of the first aspect.
In an eighteenth aspect, embodiments of the present application provide a computing device, comprising a processing component and a storage component;
the storage component stores one or more computer instructions; the one or more computer instructions are called and executed by the processing component to implement the information processing method according to the fifth aspect.
In a nineteenth aspect, an embodiment of the present application provides a computer storage medium storing a computer program, where the computer program, when executed by a computer, implements the information display method according to the first aspect.
In a twentieth aspect, an embodiment of the present application provides a computer storage medium storing a computer program, where the computer program, when executed by a computer, implements the information processing method according to the fifth aspect.
In the embodiments of the present application, the anchor terminal and a user terminal can establish an interactive connection channel, so that a first area and a second area can be provided in the live interface. The first area is used for displaying the first video picture captured by the anchor terminal, and the second area is used for displaying the second video picture. The second video picture is generated from the third video picture captured by the user terminal: a target object is identified in the third video picture, and the product effect of the target product associated with the live broadcast room is fused into the third video picture based on the target object. The second video picture therefore achieves a display effect that combines the virtual and the real, and realizes a virtual trial of the target product. A user can watch not only the anchor's explanation but also the virtual trial effect of the product, which improves the user experience, helps improve the promotion effect of the product, and achieves effective information recommendation.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative effort.
Fig. 1 is a schematic structural diagram illustrating an embodiment of a live webcasting system provided in the present application;
FIG. 2 is a flow chart illustrating one embodiment of an information display method provided herein;
fig. 3a to fig. 3c respectively show display diagrams of a live interface in an actual application of the embodiment of the present application;
FIG. 4 is a flow chart illustrating a further embodiment of an information display method provided by the present application;
fig. 5a to 5f respectively show display diagrams of a live interface in an actual application of the embodiment of the present application;
FIG. 6 is a flow chart illustrating a further embodiment of an information display method provided by the present application;
FIG. 7 is a flow chart illustrating a further embodiment of an information display method provided by the present application;
FIG. 8 is a schematic structural diagram illustrating an embodiment of an information processing method provided by the present application;
FIG. 9 is a schematic diagram illustrating an embodiment of an information display device provided by the present application;
FIG. 10 is a schematic diagram illustrating a structure of another embodiment of an information display device provided by the present application;
FIG. 11 is a schematic diagram illustrating a structure of another embodiment of an information display device provided by the present application;
FIG. 12 is a schematic diagram illustrating a structure of another embodiment of an information display device provided by the present application;
FIG. 13 is a schematic diagram illustrating an embodiment of an electronic device provided by the present application;
FIG. 14 is a schematic diagram illustrating an embodiment of an information processing apparatus provided by the present application;
FIG. 15 illustrates a schematic diagram of one embodiment of a computing device provided herein.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
In some of the flows described in the specification, claims, and above-described drawings of this application, a number of operations are included that occur in a particular order. However, it should be clearly understood that these operations may be performed out of the described order or in parallel. Operation numbers such as 101 and 102 are merely used to distinguish different operations; the numbers themselves do not represent any order of execution. In addition, the flows may include more or fewer operations, which may be executed sequentially or in parallel. It should also be noted that the terms "first", "second", and the like herein are used to distinguish different messages, devices, modules, and so on; they do not represent a sequential order, nor do they require that "first" and "second" be of different types.
The technical scheme of the embodiment of the application is mainly applied to a live webcast scene for popularizing products in a live broadcast mode, for example, an E-commerce live broadcast scene for recommending commodities in a live broadcast mode.
Because webcasting brings considerable user traffic, many product providers choose live streaming to introduce products to users. In a webcast scenario, the live content of different anchors is usually distinguished by live broadcast rooms, and a user can enter a live broadcast room through a user terminal to watch the products recommended by the anchor of that room. While explaining a product on the live site, the anchor may try the product when necessary to make it easier to understand; for example, the anchor may try on makeup when the product is a cosmetic, try it on when the product is clothing, and try it on when the product is an accessory. Users can only rely on the anchor's explanation or the anchor's trial effect to learn about the product and still have no actual impression of it, so the product promotion effect is poor, which in turn affects the product conversion rate. In an e-commerce live scenario, for example, if the goods explained by the anchor cannot effectively attract users, they will not place orders to buy, which affects the purchase rate of the goods.
In order to effectively improve the product promotion effect, the inventors have proposed the technical solution of the present application through a series of studies. The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The technical solution in the embodiments of the present application can be applied to a webcast system as shown in fig. 1. The webcast system mainly includes an anchor end 101, a server end 102, and user ends, where the user ends may include a first user end 103 and a second user end 104. It can be understood that, in a webcast scenario, the system may have at least one first user end 103 and at least one second user end 104. A viewing user registers a user account through a user end and can then watch the live video.
It should be noted that the anchor terminal and the user terminal shown in fig. 1 may be implemented in a mobile phone, a tablet computer, or other electronic devices with a video capture function in practical applications, and are not limited to the device shapes shown in fig. 1.
In the current implementation, the anchor terminal collects the sound and/or pictures of the live broadcast site where the anchor is located in real time to obtain the live content, and uploads the live content to the server end; the live content may include video pictures, audio data, and the like. The live content of different anchors can be distinguished by live broadcast rooms: an anchor may first apply to the server end for a live broadcast room through the anchor terminal, and then record and upload the live content in real time. A user may request, through a user end, to enter a certain live broadcast room; the server end then sends the live content of that live broadcast room to the user end, which plays the live content and displays its video pictures in the live interface of the live broadcast room.
The anchor can explain one or more products needing to be recommended in a live broadcast site, and a user can obtain relevant information of the products and the like from live broadcast content played by a user side.
In this embodiment of the application, as shown in fig. 1, the server 102 may establish an interactive connection channel between the anchor 101 and the first user 103, where the anchor 101 is configured to collect a first video picture; after the first user terminal 103 establishes an interactive connection channel with the anchor terminal 101, a third video picture can be collected and sent to the server terminal 102; the server 102 can identify a feature of a target object in the third video picture, and based on the feature, fuse a product effect graph of a target product associated with the live broadcast room into the third video picture to obtain a second video picture;
the server 102 may provide the first video frame and the second video frame to the second user 104, so that the second user 104 may display the first video frame and the second video frame in the live interface.
Of course, the server 102 may also provide the first video frame and the second video frame to the first user terminal 103, and the first user terminal 103 may display the first video frame and the second video frame in the live interface.
The server 102 may also provide the first video frame and the second video frame to the anchor terminal 101, or only provide the second video frame to the anchor terminal 101, so that the anchor terminal 101 can display the first video frame and the second video frame in the live interface.
In the embodiments of the present application, the first video picture and the second video picture can be displayed in the live interface, and the second video picture is a live picture in which the product effect graph is fused with the third video picture. A viewer can therefore watch not only the anchor's explanation but also the virtual trial effect of the product in the live interface, which improves the user experience and the product promotion effect.
As can be seen from the above description, the first user end 103 and the second user end 104 differ only in that the first user end 103 has established an interactive connection channel with the anchor end 101, realizing a multi-user live co-streaming (mic-connection) scenario: it can view in the live interface not only the video picture of the site where the anchor end is located, but also the video picture of the site where the first user end itself is located. Of course, the first user end may also collect audio data and the like of its own site for interactive transmission, which is not specifically limited in the present application.
The user ends may be configured on electronic devices such as mobile phones, tablet computers, computers, and smart watches. The server end may be implemented by a Content Delivery Network (CDN) system or another processing system. The anchor end may be composed of an electronic device having a capture function and an Open Broadcaster Software (OBS) stream-pushing function, for example an intelligent device with a camera such as a mobile phone or a tablet, and the first user end may likewise be composed of an electronic device having a capture function and an OBS stream-pushing function.
Of course, the present application is not limited to the implementation of live webcasting by using the above live webcasting technical solution. In addition, as can be understood by those skilled in the art, the live content may need to be processed by encoding, transcoding, compressing, and the like before being uploaded to the server, and correspondingly, the client may need to be processed by decoding, decompressing, and the like before playing the live data, and the like, which is the same as that in the prior art and is not described again.
The user side and the anchor side can be independent application programs, and can also be functional modules integrated in other application programs and the like.
Based on the network live broadcast system shown in fig. 1, as shown in fig. 2, an embodiment of an information display method provided in the embodiment of the present application is a flowchart, and the method may include the following steps:
201: a first zone is provided in a live interface of a live room.
The solution of the embodiment shown in fig. 2 may be performed by the anchor side, the first user side or the second user side.
202: a first video picture is displayed in the first area.
The first video picture is captured by the anchor terminal. The anchor terminal can capture the pictures of the live broadcast site in real time, and can also capture sound in real time, so as to generate the live content of the live broadcast room; the live content, which contains the first video picture, can be uploaded to the server end.
After entering the live broadcasting room, the first user end and the second user end can obtain the live broadcasting content from the server end and play the live broadcasting content, wherein the live broadcasting content comprises the first video picture, namely, a first area is provided in a live broadcasting interface to display the first video picture.
The anchor terminal can also provide a live interface and display the captured first video picture in a first area of that live interface.
203: a second region is provided in the live interface.
204: and displaying the second video picture in the second area.
The second video picture is obtained by identifying a target object from the third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal. The first user end and the anchor end can establish an interactive connection channel, so that the first user end is triggered to acquire a third video picture.
As an optional manner, fusing the product effect of the target product to the third video frame may be obtained by performing corresponding image processing on the target object in the third video frame based on the product effect.
Alternatively, the product effect of the target product can be represented by a product effect graph. The second video picture may then specifically be obtained by identifying a target object from the third video picture and fusing a product effect graph of the target product associated with the live broadcast room into the third video picture based on the target object; optionally, the target object identified from the third video picture may specifically be a target object associated with the target product.
The second video frame may be obtained by fusing a product effect diagram of the target product associated with the live broadcast room to the third video frame based on the characteristic portion of the target object.
Alternatively, the characteristic portion of the target object may be determined based on a user selection operation.
Alternatively, the feature portion of the target object corresponding to the target product may be identified from the third video image in combination with the target product. Therefore, the second video frame can be obtained by identifying a characteristic part of the target object from the third video frame and fusing a product effect graph of the target product associated with the live broadcast room into the third video frame based on the characteristic part.
The live broadcast room can be associated with at least one product. The at least one product can be published in advance by the anchor; for example, the anchor can provide the at least one product when applying for the live broadcast room, and the server end can then associate the at least one product with the live broadcast room. Of course, the at least one product associated with the live broadcast room may also consist of products that the anchor has explained historically as well as the product currently being explained.
The target product may be the product currently being explained by the anchor, or any product selected by the anchor or the first user from the at least one product.
The target object may refer to the trial object corresponding to the target product, and the feature part of the target object may refer to the trial part of the target object corresponding to the target product. For example, if the target product is a product applied to the human body, the target object may be a human body, and the feature part can be determined according to the trial part corresponding to the target product: if the target product is a lipstick, the feature part is the lips; if the target product is a liquid foundation, the feature part is the face; if the target product is a jacket, the feature part is the upper limbs; if the target product is a hat, the feature part is the head; and so on. Of course, the target product may also be a product suitable for other objects; for example, when the target product is a television, the target object may be a wall or a television cabinet, and the feature part may be a blank area of the wall or a free area of the television cabinet. Assuming that the viewing user corresponding to the first user end is a first user and the target object is a human body, the third video picture may be obtained by the first user operating the capture device of the first user end to shoot himself or herself, or to shoot another user at the site where the first user is located; if the target object is a non-human article, the first user may operate the capture device of the first user end to shoot that article. The target object can thus be identified from the third video picture, and the feature part corresponding to the target product can be determined.
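Purely as an illustration of the mapping described above, the following snippet shows one way a trial feature part could be looked up from a product category. The category names and the table itself are assumptions for illustration, not something defined by the application.

```python
# Hypothetical product-category -> trial-part lookup, following the examples above.
TRIAL_PART_BY_CATEGORY = {
    "lipstick": "lips",
    "foundation": "face",
    "jacket": "upper_limbs",
    "hat": "head",
    "television": "wall_or_tv_cabinet_free_area",
}


def feature_part_for(product_category: str) -> str:
    # Fall back to a generic whole-object region when no rule is known.
    return TRIAL_PART_BY_CATEGORY.get(product_category, "whole_object")


print(feature_part_for("lipstick"))  # -> "lips"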
The product effect graph may be generated based on the product form of the target product in an actual use state. For example, when the target product is a cosmetic such as a lipstick, the product effect graph may be a lip effect image with the lipstick color; the feature part of the target object is the lips, and mapping the lip effect image onto the lips of the target object forms a lip makeup effect in which the target object wears that lipstick color. Because the target object is a really shot object while the product effect graph is virtual content, superimposing the product effect graph on the target object forms a display effect that combines the virtual and the real, as if the target product were actually being used, which gives the user a more intuitive feeling and improves the user experience.
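As a rough, non-authoritative sketch of the fusion step for the lipstick example, the following code tints a given lip region toward a lipstick color. It assumes a lip mask has already been obtained by some face-landmark method, which the application does not specify; the color, strength, and dummy data are illustrative only.

```python
# Sketch, assuming a precomputed lip mask (the detection method is not specified here).
import numpy as np


def apply_lip_color(frame_bgr: np.ndarray, lip_mask: np.ndarray,
                    lipstick_bgr=(60, 20, 200), strength: float = 0.6) -> np.ndarray:
    """frame_bgr: HxWx3 uint8 frame; lip_mask: HxW floats in [0,1] (1 inside the lips)."""
    color_layer = np.zeros_like(frame_bgr, dtype=np.float32)
    color_layer[:] = lipstick_bgr
    alpha = (lip_mask.astype(np.float32) * strength)[..., None]   # HxWx1 blend weights
    fused = frame_bgr.astype(np.float32) * (1.0 - alpha) + color_layer * alpha
    return fused.clip(0, 255).astype(np.uint8)


# Usage with dummy data: a gray frame and a rectangular "lip" mask.
frame = np.full((480, 640, 3), 128, dtype=np.uint8)
mask = np.zeros((480, 640), dtype=np.float32)
mask[300:340, 280:360] = 1.0
second_frame = apply_lip_color(frame, mask)
```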
The first user side can establish an interactive connection channel with the anchor side through the server side, after the interactive connection channel with the anchor side is established, the first user side can acquire pictures of a site where the first user side is located, sound and the like can be included to form interactive content, and the third video picture is included in the interactive content.
The operations of identifying the feature part of the target object in the third video picture and fusing the product effect graph to obtain the second video picture can be executed by the first user end or by the server end. When they are executed by the first user end, the first user end can send the second video picture to the server end, and the server end can provide the second video picture to the second user end and the anchor end; when they are executed by the server end, the server end can provide the second video picture to the first user end, the second user end, and the anchor end respectively.
After the first user side, the second user side or the main broadcasting side obtains the second video picture, the second video picture can be displayed in a second area in the live broadcasting interface.
The interactive content acquired by the first user terminal can also be sent to the server terminal in real time, and the server terminal can provide the interactive content for the anchor terminal and the second user terminal. Therefore, after the first user side, the second user side or the main broadcasting side obtains the third video picture, the second area can be provided in the live broadcasting interface, and the third video picture can be displayed firstly before the second video picture is displayed in the second area.
Providing the second area in the live interface may mean that the second area is provided in the live interface and displayed overlaid on the first area, or that the first area is displayed overlaid on the second area. Of course, the first area and the second area may also be displayed side by side in the live interface without covering each other, which is not limited in the present application.
It should be noted that, as will be understood by those skilled in the art, when an interactive connection channel is established between the anchor terminal and the first user terminal, two streams of data are collected in real time (one by the anchor terminal and one by the first user terminal). The first video picture and the second video picture may therefore be transmitted after being mixed; after the anchor terminal, the first user terminal, or the second user terminal obtains the mixed-stream picture, the first area and the second area can be provided in the live interface at the same time, the first video picture in the mixed-stream picture can continue to be displayed in the first area, and the second video picture in the mixed-stream picture can be displayed in the second area.
Similarly, the first video picture and the third video picture may be mixed and then transmitted. After the anchor terminal, the first user terminal, or the second user terminal obtains the mixed-stream picture, the first area and the second area can be provided in the live interface at the same time, the first video picture in the mixed-stream picture can continue to be displayed in the first area, and the third video picture in the mixed-stream picture can be displayed in the second area.
In practical applications, the data mixing may be performed by the anchor terminal, which then sends the mixed-stream picture to the server end for delivery to the second user end or the first user end; or the anchor terminal performs the mixing and sends the mixed-stream picture to the server end for delivery to the second user end, while the first user end performs the data mixing independently to obtain its own mixed-stream picture; or the anchor terminal, the first user end, and the second user end each perform the data mixing independently to obtain the mixed-stream picture. The implementation of this technique is not specifically limited; the present application only takes the video pictures as an example. The audio data collected by the anchor terminal and the first user terminal respectively may likewise be mixed before transmission, in the same way as the commonly used live co-streaming (mic-connection) technique, which is not described again here.
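The mixed-stream step can be pictured with the sketch below, which composes the two pictures into one frame as a simple picture-in-picture layout. Real implementations typically mix encoded streams; this raw-frame version, including the scaling and corner placement, is only an assumption for illustration.

```python
# Illustrative raw-frame mixing: overlay a shrunken first picture on the second picture.
import numpy as np


def mix_frames(first_frame: np.ndarray, second_frame: np.ndarray,
               inset_scale: float = 0.3) -> np.ndarray:
    """Return a mixed frame with the first picture inset in a corner of the second."""
    h, w = second_frame.shape[:2]
    ih, iw = int(h * inset_scale), int(w * inset_scale)
    # Nearest-neighbour shrink of the first picture (placeholder for real scaling).
    ys = np.linspace(0, first_frame.shape[0] - 1, ih).astype(int)
    xs = np.linspace(0, first_frame.shape[1] - 1, iw).astype(int)
    inset = first_frame[ys][:, xs]
    mixed = second_frame.copy()
    mixed[0:ih, w - iw:w] = inset   # anchor picture in the top-right boundary area
    return mixed


mixed = mix_frames(np.zeros((360, 640, 3), np.uint8),
                   np.full((720, 1280, 3), 200, np.uint8))
```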
In this embodiment, the first video picture and the second video picture can be displayed in the live interface, and the second video picture is a live picture that fuses the product effect graph with the third video picture, so that the anchor's explanation and the virtual trial effect of the product can both be watched in the live interface. This improves the user's viewing experience and thus helps improve the product promotion effect.
For ease of understanding, fig. 3a to fig. 3c respectively show possible displays of the live interface in a practical application. Fig. 3a shows the live interface when no interactive connection channel has been established between the anchor end and the first user end: the live interface 300 includes a first area 301, which may fill the whole live interface and is used for displaying the first video picture captured by the anchor end.
Fig. 3b shows a schematic view of a live interface when the anchor terminal establishes an interactive connection channel with the first user terminal, where the live interface may simultaneously provide a first area 301 and a second area 302, and optionally, the first area 301 may be displayed in a superimposed manner in the second area 302, covering a part of the second area, and in order not to affect the display effect of the second area 302, the first area 301 may be displayed in a superimposed manner in a boundary area of the second area 302.
The first area 301 continues to display the first video picture, and after the first user terminal establishes the interactive connection channel with the anchor terminal, a third video picture can be acquired in real time and can be displayed first in the second area 302.
Then, based on the product effect map of the target product, the target object and the feature portion thereof corresponding to the target product may be identified from the third video picture, so that the product effect map may be fused with the third video picture to obtain a second video picture, and thus the second video picture may be displayed in the second area 302, as shown in fig. 3 c.
It should be noted that, besides the first area and the second area, the live interface may also include other content, which may differ depending on the terminal (the anchor end, the first user end, or the second user end) to which the live interface corresponds. For example, the live interface may further include a bullet-screen display area, a bullet-screen sending control, a forwarding control, a favorites control, a like control, introduction content related to the anchor, a red-packet pickup control that may be provided in an e-commerce live scenario, and other content promoted according to the needs of the actual scenario.
In some embodiments, after the second video picture is displayed in the second area, the method may further include:
displaying an operation control corresponding to a target product in a live broadcast interface;
and responding to the trigger operation aiming at the operation control, and performing corresponding processing on the target product.
In practical application, the operation controls corresponding to the target product can be displayed only in the live broadcast interfaces corresponding to the first user side and the second user side, so that a user can conveniently execute trigger operation; of course, in some implementations, in order to reduce the processing amount, the display contents in the live interfaces corresponding to the anchor, the first user, and the second user may also be the same.
Optionally, the operation control may include a processing control corresponding to at least one processing type; responding to the triggering operation for the operation control, performing corresponding processing on the target product may include: and responding to the triggering operation of the processing control aiming at any processing type, and performing corresponding processing on the target product according to the processing type.
In an e-commerce live broadcast scene, the target product is a commodity that can be traded. The processing control of at least one processing type may include a transaction control and/or a purchase adding control. The processing triggered by the transaction control is an order placing operation; the purchase adding control requests adding the product to a shopping cart, so the processing triggered by it may be an operation of adding the target product to the shopping cart.
Thus, in one implementation, the processing controls may include a buy-in control; in response to the triggering operation of the processing control for any processing type, performing corresponding processing on the target product according to the processing type may include:
and responding to the triggering operation of the purchase adding control, and adding the target product to the shopping cart.
In another implementation, the processing control includes a transaction control; in response to the triggering operation of the processing control for any processing type, performing corresponding processing on the target product according to the processing type may include:
and generating an ordering request based on the target product in response to the triggering operation aiming at the transaction control.
In addition, the operation control may further include an attribute selection control for providing product attribute options, so that the user can select the required product attributes; the product attributes may include quantity, specification, size, and the like. In response to the triggering operation of the processing control for any processing type, performing corresponding processing on the target product according to the processing type may include:
and responding to the selection operation of the product attribute options provided in the attribute selection control and the trigger operation of the processing control of any processing type, and performing corresponding processing on the target product with the selected product attributes according to the processing type.
For example, the target product having the selected product attribute may be added to a shopping cart, or an order placement request may be generated based on the target product having the selected product attribute, or the like.
It can be understood that the target product generally corresponds to an actual physical product. The target product added to the shopping cart, and the target product referred to in the order placing request, are virtual forms of the target product in the network environment; after an order is placed successfully, the actual physical product is delivered to the receiving address by logistics, which is a common network transaction mode and is not described herein again.
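As a rough illustration of how the operation controls above might be wired up on a client, the sketch below dispatches a trigger operation on a processing control to the corresponding processing (adding to the shopping cart or generating an order placing request), carrying along any product attributes selected through the attribute selection control. All identifiers here are hypothetical; the embodiment does not prescribe a particular API.

```python
from enum import Enum, auto

class ProcessingType(Enum):
    ADD_TO_CART = auto()  # purchase adding control
    PLACE_ORDER = auto()  # transaction control

def on_processing_control_triggered(processing_type: ProcessingType,
                                    target_product_id: str,
                                    selected_attributes: dict | None = None) -> dict:
    """Handle a trigger operation on a processing control in the live interface.

    selected_attributes: optional attributes chosen via the attribute selection
    control, e.g. {"quantity": 2, "size": "M"} (illustrative keys).
    """
    attributes = selected_attributes or {}
    if processing_type is ProcessingType.ADD_TO_CART:
        # Add the (virtual form of the) target product to the shopping cart.
        return {"action": "add_to_cart",
                "product_id": target_product_id,
                "attributes": attributes}
    if processing_type is ProcessingType.PLACE_ORDER:
        # Generate an order placing request based on the target product.
        return {"action": "place_order",
                "product_id": target_product_id,
                "attributes": attributes}
    raise ValueError(f"unsupported processing type: {processing_type}")
```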
In the embodiment of the application, the operation control for the target product is displayed in the live interface, so that a user can directly trigger processing of the target product in the live interface without jumping to a product description page, which improves processing convenience and does not interrupt the watching of live content. The watching users of the first user terminal and the second user terminal can not only follow the anchor's explanation in the live broadcast room but also view the virtual trial effect of the product, which increases the attraction of the live broadcast, raises user interest, and helps improve the product conversion rate; since product processing can be initiated directly from the live interface, this convenient processing mode further helps improve the conversion rate.
In addition, in response to the trigger operation on the operation control, a product description page corresponding to the target product may also be displayed. Therefore, in some embodiments, in response to the triggering operation for the operation control, performing corresponding processing on the target product may include:
and responding to the triggering operation aiming at the operation control, displaying a product description page corresponding to the target product.
Optionally, the product description page can be jumped to from the live interface to display the product description page.
As can be seen from the foregoing description, the technical solution of the embodiment shown in fig. 2 may be executed by the anchor terminal, the first user terminal, or the second user terminal, so that the display of the live interface in the anchor terminal, the first user terminal, or the second user terminal may be implemented.
In one implementation, the technical solution of the embodiment shown in fig. 2 may be executed by the first user end, and therefore, in some embodiments, the method may further include:
providing a first interactive control in a live interface;
responding to the triggering operation aiming at the first interactive control, and establishing an interactive connection channel with the anchor terminal through the server terminal;
and collecting a third video picture and sending the third video picture to the server.
Optionally, the second area may be provided in the live interface after the first user terminal establishes the interactive connection channel with the anchor terminal through the server terminal.
Alternatively, the third video picture may be displayed first before the second video picture is displayed in the second area.
And after obtaining the third video picture, the server can provide the third video picture to the anchor terminal and the second user terminal, and simultaneously the server can trigger the live interface of the anchor terminal or the second user terminal to provide a second area, so that the anchor terminal or the second user terminal can display the third video picture in the second area of the respective displayed live interface.
The step of establishing an interactive connection channel with the anchor terminal through the server terminal in response to the triggering operation for the first interactive control may be executed in the following manner:
and responding to the triggering operation of the first user for the first interaction control, sending a first interaction request to the server side so that the server side sends first interaction prompt information to the anchor side, and establishing an interaction connection channel between the first user side and the anchor side based on a first confirmation request fed back by the anchor side.
Optionally, the server receives the first confirmation request fed back by the anchor, and may feed back the first interaction instruction to the first user after the interactive connection channel between the first user and the anchor is established.
The first user terminal can respond to the first interaction instruction, provide the second area in the live broadcast interface, acquire the third video picture, send the third video picture to the server terminal and the like.
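The handshake just described can be pictured, from the first user terminal's side, as the message sequence sketched below. The transport, message names, and the `server` helper object are assumptions made purely for illustration; the embodiment does not fix a concrete protocol.

```python
def request_interaction(server, first_user_id: str, anchor_id: str,
                        capture_frame, display_in_second_area) -> bool:
    """First user terminal side of the handshake (all names are hypothetical).

    capture_frame           : callable returning one frame of the third video picture
    display_in_second_area  : callable rendering a frame into the second area
    """
    # 1. Trigger operation on the first interactive control -> first interaction request.
    server.send({"type": "first_interaction_request",
                 "from": first_user_id, "to": anchor_id})

    # 2. The server forwards first interaction prompt information to the anchor terminal;
    #    once the anchor feeds back a first confirmation request, the server establishes
    #    the channel and returns a first interaction instruction to this terminal.
    instruction = server.wait_for("first_interaction_instruction")
    if not instruction.get("channel_established"):
        return False

    # 3. Provide the second area, then collect the third video picture and upload it
    #    for as long as the channel stays open.
    while server.channel_open(first_user_id, anchor_id):
        frame = capture_frame()
        server.send({"type": "third_video_frame", "frame": frame})
        display_in_second_area(frame)  # shown locally until the fused picture arrives
    return True
```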
In addition, in some embodiments, the anchor terminal may actively initiate an interaction with the first user terminal. Second interaction controls corresponding to different users may be provided in the live interface of the anchor terminal; in response to a trigger operation of the anchor for any one of the second interaction controls, the anchor terminal may send a second interaction request to the server terminal, the server terminal may send second interaction prompt information to the first user terminal corresponding to that second interaction control, and the interactive connection channel between the first user terminal and the anchor terminal is established based on a second confirmation request fed back by the first user terminal.
Therefore, in some embodiments, after the first user terminal provides the first area in the live interface and displays the first video frame in the first area, the method may further include:
the first user terminal displays second interaction prompt information in a live broadcast interface; the second interaction prompt message is sent by the server side after receiving a second interaction request of the anchor side;
responding to the confirmation operation aiming at the second interaction prompt message, and sending a second confirmation request to the server so that the server establishes an interaction connection channel between the first user terminal and the anchor terminal based on the second confirmation request;
and acquiring a third video picture, sending the third video picture to the server, so that the server can identify a target object characteristic part corresponding to the target product from the third video picture based on a trial request which is sent by the anchor terminal and aims at the target product associated with the live broadcast room, and fusing a product effect graph of the target product into the third video picture based on the characteristic part to obtain a second video picture.
Optionally, after the interactive connection channel between the first user terminal and the anchor terminal is established, the server terminal may feed back a second interaction instruction to the first user terminal, and the first user terminal may, in response to the second interaction instruction, provide the second area in the live interface, acquire the third video picture, and the like.
As an alternative, before displaying the second video picture in the second area, the method may further include:
displaying trial prompt information corresponding to at least one product associated with a live broadcast room in a live broadcast interface;
and responding to the triggering operation aiming at any trial prompt information, sending a trial request to the server based on the target product corresponding to any trial prompt information so that the server can identify the target object characteristic part corresponding to the target product in the third video picture, and fusing the product effect graph of the target product to the third video picture based on the characteristic part to obtain the second video picture.
In order to facilitate the user to select the target product, trial prompt information corresponding to at least one product associated with the live broadcast room can be provided in the live broadcast interface for the user to select, and the corresponding target product can be determined based on any trial prompt information selected by the user.
The trial prompt information can be used for prompting the user to perform virtual trial of the product and the like.
After obtaining the second video picture, the server may provide the second video picture to the first user side, the second user side, and the anchor side, respectively. Therefore, the first user terminal can acquire the second video picture from the server terminal and display the second video picture in the second area.
As another alternative, before displaying the second video picture in the second area, the method further comprises:
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in a live broadcast interface;
responding to the trigger operation aiming at any trial prompt information, and acquiring a product effect graph of a target product from the server based on the target product corresponding to any trial prompt information;
and fusing the product effect picture into a third video picture to obtain a second video picture.
That is, the second video picture may be obtained by the first user terminal through fusion. Optionally, after the first user terminal obtains the second video picture, it may send the second video picture to the server terminal, and the server terminal may provide it to the anchor terminal or the second user terminal.
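For the client-side alternative just described, the fusion step itself can be as simple as alpha-blending the product effect graph over the identified feature region of the third video picture. The sketch below assumes the effect graph is an RGBA image and that a separate detector (not shown) has already located the feature part; it is only meant to show the shape of the computation, not a prescribed implementation.

```python
import numpy as np

def fuse_effect_into_frame(frame_rgb: np.ndarray,
                           effect_rgba: np.ndarray,
                           region: tuple) -> np.ndarray:
    """Blend a product effect graph onto one region of the third video picture.

    frame_rgb   : H x W x 3 uint8 frame captured by the first user terminal
    effect_rgba : h x w x 4 uint8 product effect graph with an alpha channel
    region      : (x, y, w, h) of the identified feature part in the frame
    Returns the fused frame, i.e. one frame of the second video picture.
    """
    x, y, w, h = region
    fused = frame_rgb.copy()

    # Resize the effect graph to the feature region (nearest-neighbour for brevity).
    ys = np.linspace(0, effect_rgba.shape[0] - 1, h).astype(int)
    xs = np.linspace(0, effect_rgba.shape[1] - 1, w).astype(int)
    effect = effect_rgba[ys][:, xs]

    # Alpha-blend the effect over the region of interest.
    alpha = effect[..., 3:4].astype(np.float32) / 255.0
    roi = fused[y:y + h, x:x + w].astype(np.float32)
    blended = alpha * effect[..., :3].astype(np.float32) + (1.0 - alpha) * roi
    fused[y:y + h, x:x + w] = blended.astype(np.uint8)
    return fused
```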
In some embodiments, the server terminal may provide the second video picture only to second user terminals that meet a predetermined requirement.
The predetermined requirement may be that the user account is a predetermined account, for example a high-level account; in practical applications, a high-level account may be obtained by payment, or may be a user account whose login duration exceeds a certain duration, whose product purchase rate exceeds a certain threshold, and the like.
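A server-side eligibility check for this predetermined requirement could look like the short sketch below; the account fields and the concrete thresholds are invented for illustration only.

```python
def meets_predetermined_requirement(account: dict,
                                    min_login_hours: float = 100.0,
                                    min_purchase_rate: float = 0.2) -> bool:
    """Decide whether a second user terminal should receive the second video picture.

    account: e.g. {"is_premium": True, "login_hours": 35.0, "purchase_rate": 0.1}
    (field names and thresholds are illustrative assumptions).
    """
    return (account.get("is_premium", False)
            or account.get("login_hours", 0.0) >= min_login_hours
            or account.get("purchase_rate", 0.0) >= min_purchase_rate)
```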
Referring to fig. 4, which is a flowchart illustrating an information display method from the perspective of the first user terminal, the method may include the following steps:
401: and providing a first area and a first interactive control in the live interface, and displaying a first video picture in the first area.
402: and responding to the triggering operation aiming at the first interactive control, establishing an interactive connection channel with the main broadcasting end through the server end, and providing a second area in the live broadcasting interface.
Wherein providing the second area in the live interface may include: and providing a second area and the first area in the live interface, and displaying the first area in the second area in an overlapping manner so as to cover partial area in the second area.
Of course, the second area may be displayed in the first area in an overlapping manner to cover a partial area in the first area. In addition, the first area and the second area may also be displayed in a split manner in the live interface, and the first area and the second area are not covered with each other, and the like.
403: and collecting a third video picture and sending the third video picture to the server.
Optionally, after acquiring the third video picture, the third video picture may be displayed in the second area.
404: and displaying trial prompt information corresponding to at least one product associated with the live broadcast room in a live broadcast interface.
Optionally, a trial control may be displayed in the live interface, and step 404 is executed after the trigger operation for the trial control is detected.
405: and responding to the triggering operation aiming at any trial prompt information, sending a trial request to the server based on the target product corresponding to any trial prompt information so that the server can identify the target object corresponding to the target product in the third video picture, fusing the product effect of the target product into the third video picture based on the target object to obtain a second video picture, and providing the second video picture to the first user side.
Optionally, the server may specifically identify a target object feature portion corresponding to a target product in the third video image, fuse a product effect graph of the target product into the third video image based on the feature portion to obtain a second video image, and provide the second video image to the first user side.
406: a second video picture is displayed in the second area.
Optionally, after the second video picture is displayed in the second area, an operation control corresponding to the target product may be provided in the live interface, so that corresponding processing on the target product may be directly triggered.
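Steps 404 to 406 could be realized on the first user terminal roughly as follows. The request fields and the `server` helper are assumptions, and the server is treated as returning the fused second video picture frame by frame; this is a sketch of the flow, not a prescribed API.

```python
def run_virtual_trial(server, live_room_id: str,
                      render_second_area, choose_prompt) -> None:
    """Sketch of steps 404-406 from the first user terminal's perspective."""
    # 404: show trial prompt information for the products associated with the room.
    prompts = server.fetch_trial_prompts(live_room_id)
    chosen = choose_prompt(prompts)  # hypothetical UI hook returning the selected prompt

    # 405: send a trial request for the corresponding target product.
    server.send({"type": "trial_request",
                 "room": live_room_id,
                 "product_id": chosen["product_id"]})

    # 406: display the second video picture returned by the server in the second area.
    for fused_frame in server.second_video_stream():
        render_second_area(fused_frame)
```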
For ease of understanding, fig. 5a to fig. 5c show interface display diagrams of a live interface provided by the first user terminal in an actual application. The live interface 500 of fig. 5a includes a first area 501 and displays the first video picture in the first area 501; in addition, the live interface 500 may include a first interaction control 502, which may be used to prompt the user to interact with the anchor so as to establish an interactive connection channel.
After a trigger operation for the first interaction control 502 is detected, an interactive connection channel with the anchor terminal may be established through the server terminal. As shown in fig. 5b, a second area 502 may be provided in the live interface 500, and the first area 501 may be superimposed on a boundary area of the second area 502, so that the first area 501 is displayed as a small window and the second area 502 as a large window. In practical applications, based on a user switching operation, the second area 502 may instead be displayed as a small window and the first area 501 as a large window, that is, the second area 502 is superimposed on a boundary area of the first area 501. In addition, the area displayed as a small window may be repositioned, for example the first area 501 may be moved from a first boundary area to a second boundary area of the second area 502; the display form is not particularly limited in the present application.
After the first user terminal establishes the interactive connection channel with the anchor terminal, it can capture the scene where it is located to obtain a third video picture; as shown in fig. 5b, the third video picture can be displayed in the live interface 500. In addition, a trial control 503 may be displayed in the live interface 500. When a trigger operation for the trial control 503 is detected, as shown in fig. 5c, trial prompt information 504 corresponding to at least one product may be displayed in the live interface 500, and the user may select any piece of the trial prompt information 504. Based on the target product corresponding to the selected trial prompt information, the target object corresponding to the target product in the third video picture may be identified; assuming the target product is a hat, the target object is a human body. The feature part corresponding to the target product, assumed here to be the head, is then identified in the human body, so that the product effect graph 505 of the target product can be mapped onto the head in the third video picture, forming a virtual effect of the hat being worn and obtaining the second video picture.
Therefore, the second video picture can be displayed in real time in the second area in the live broadcast interface, so that a user can conveniently check the virtual trial effect of the target product. Certainly, the user may also perform a trigger operation for other trial prompt information to switch to display virtual trial effects corresponding to different products and update the second video picture.
In addition, as shown in fig. 5c, operation controls corresponding to the target product may also be provided in the live interface 500, such as a purchase adding control 506 and a transaction control 507.
In addition, as shown in fig. 5b and fig. 5c, an interaction canceling control 508 may also be displayed in the live interface, and when a trigger operation for the interaction canceling control is detected, an interaction connection channel between the first user side and the anchor side may be disconnected.
Fig. 5d to fig. 5f are schematic diagrams of the live interface provided by the second user terminal. Fig. 5d shows the interface before the first user terminal establishes an interactive connection channel with the anchor terminal, which may be the same as fig. 5a. Fig. 5e shows the interface displaying the third video picture after the first user terminal and the anchor terminal establish the interactive connection channel; it differs from fig. 5b in that the trial control, the interaction canceling control, and the like may not be included. Fig. 5f shows the interface displaying the second video picture after the channel is established; it differs from fig. 5c in that the interaction canceling control and the like may not be included.
It should be noted that one first user terminal, or multiple first user terminals at the same time, may be allowed to establish interactive connection channels with the anchor terminal. When multiple first user terminals are allowed to establish interactive connection channels with the anchor terminal, multiple second areas, each corresponding to one first user terminal, may be provided in the live interface and superimposed on the first area for display; the operations performed for each first user terminal are the same as described above and are not repeated here.
In another implementation, the technical solution of the embodiment shown in fig. 2 may be executed by the anchor side, and therefore, in some embodiments, the method may further include:
displaying second interactive controls corresponding to different users in a live broadcast interface;
and responding to the trigger operation of the anchor aiming at any second interactive control, and establishing an interactive connection channel corresponding to any second interactive control and the first user side through the server side.
That is, the anchor terminal can actively initiate the interactive operation with the first user terminal.
Optionally, in response to a trigger operation of the anchor for any second interactive control, establishing, by the server, an interactive connection channel of the first user side corresponding to any second interactive control may include:
and responding to the trigger operation of the anchor to any second interactive control, sending a second interactive request to the server so that the server sends second interactive prompt information to the first user side corresponding to any second interactive control, and establishing an interactive connection channel between the first user side and the anchor side based on a second confirmation request fed back by the first user side.
The first user side can display the second interaction prompt information in the live broadcast interface after obtaining the second interaction prompt information, and can feed back a second confirmation request to the server side after detecting the confirmation operation aiming at the second interaction prompt information.
After the server establishes the interactive connection channel between the first user side and the anchor side, a second interactive instruction can be fed back to the first user side, and the first user side responds to the second interactive instruction, can acquire a third video picture and sends the third video picture to the server.
The server side can provide the third video picture to the anchor side and the second user side, the anchor side and the second user side can provide a second area in respective live broadcast interfaces, and the third video picture can be displayed in the second area firstly until the second video picture is obtained.
In some embodiments, after the anchor provides the second area in the live interface, the method further includes:
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in a live broadcast interface;
and responding to the triggering operation aiming at any trial prompt information, sending a trial request to the server based on the target product corresponding to any trial prompt information so that the server can identify the characteristic part of the target object in the third video picture, and fusing the product effect graph of the target product to the third video picture based on the characteristic part to obtain the second video picture.
The server can provide the second video picture to the anchor terminal, the first user terminal and the second user terminal.
Optionally, the server terminal may perform mixed-flow processing on the first video picture and the second video picture, and then provide the mixed-flow picture to the anchor terminal, the first user terminal, and the second user terminal, so that after obtaining the mixed-flow picture, the anchor terminal can display the second video picture in the second area and the first video picture in the first area.
As can be seen from the foregoing description, it may also be that the first user terminal actively initiates an interactive operation with the anchor terminal, and therefore, in some embodiments, the method may further include:
displaying first interaction prompt information in a live interface, wherein the first interaction prompt information is sent by a server side after receiving a first interaction request of a first user side;
and responding to the confirmation operation aiming at the first interaction prompt message, sending a first confirmation request to the server side, so that the server side establishes an interaction connection channel between the anchor side and the first user side based on the first confirmation request, and indicating the first user side to collect a third video picture.
Optionally, after the server establishes the interactive connection channel between the anchor terminal and the first user terminal, the server may send a first interactive instruction to the first user terminal, and the first user terminal may provide a first area in a live interface thereof in response to the first interactive instruction, and acquire a third video picture, and the like.
As shown in fig. 6, which is a flowchart of an information display method described from the perspective of the anchor terminal, the method may include the following steps:
601: and providing a first area and second interactive controls corresponding to different users in a live interface, and displaying a first video picture in the first area.
602: and responding to the triggering operation aiming at any second interactive control, establishing an interactive connection channel corresponding to any second interactive control and the first user side through the server side, and providing a second area in the live broadcast interface.
603: and displaying trial prompt information corresponding to at least one product associated with the live broadcast room in a live broadcast interface.
604: and responding to the triggering operation aiming at any trial prompt information, sending a trial request to the server based on a target product corresponding to any trial prompt information so that the server can identify a target object corresponding to the target product in a third video picture acquired by the first user side, fusing the product effect of the target product into the third video picture based on the target object to obtain a second video picture, and providing the second video picture to the anchor terminal.
Optionally, the server may specifically identify a target object feature part corresponding to a target product in a third video image acquired by the first user, fuse a product effect graph of the target product into the third video image based on the feature part to obtain a second video image, and provide the second video image to the anchor terminal.
605: a second video picture is displayed in the second area.
Referring to fig. 7, which is a flowchart of an information display method described from the perspective of the second user terminal, the method may include the following steps:
701: and acquiring a first video picture and a second video picture provided by the server.
The first video picture is acquired by a main broadcasting terminal; the second video picture is obtained by identifying a target object from the third video picture and fusing the product effect of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal.
Optionally, the second video picture may specifically be obtained by identifying a feature of the target object from the third video picture, and fusing a product effect graph of the target product associated with the live broadcast room to the third video picture based on the feature;
702: and providing a first area and a second area in the live interface.
703: a first video picture is displayed in the first area, and a second video picture is displayed in the second area.
Optionally, an operation control corresponding to the target product may be further provided in the live interface, and therefore, in some embodiments, the method may further include:
displaying an operation control corresponding to a target product in a live broadcast interface;
and responding to the trigger operation aiming at the operation control, and performing corresponding processing on the target product.
The specific implementation form of the operation control may be detailed as above, and is not described herein again.
Fig. 8 is a flowchart of an embodiment of an information processing method provided in an embodiment of the present application, and the embodiment is described from the perspective of server side execution, where the method may include the following steps:
801: and establishing an interactive connection channel between the first user terminal and the anchor terminal.
As an alternative, establishing the interactive connection channel between the first user end and the anchor end may include:
receiving a first interaction request sent by a first user end, sending first interaction prompt information to a main broadcast end so that the main broadcast end can output the first interaction prompt information, and generating a first confirmation request based on confirmation operation aiming at the first interaction prompt information;
and receiving a first confirmation request sent by the anchor terminal, and establishing an interactive connection channel between the first user terminal and the anchor terminal.
The first interaction request can be generated by providing a first interaction control in a live interface of the first user terminal and responding to a trigger operation aiming at the first interaction control.
After the server establishes an interactive connection channel between the first user terminal and the anchor terminal, a first interactive instruction can be sent to the first user terminal.
As another alternative, establishing the interactive connection channel between the first user end and the anchor end may include:
receiving a second interaction request sent by the anchor terminal, sending second interaction prompt information to the corresponding first user terminal so that the first user terminal can output the second interaction prompt information, and generating a second confirmation request based on confirmation operation aiming at the second interaction prompt information;
and receiving a second confirmation request sent by the first user terminal, and establishing an interactive connection channel between the first user terminal and the anchor terminal.
The second interaction request may be generated by displaying second interaction controls corresponding to different users in a live interface of the anchor terminal and responding to a trigger operation for any one of the second interaction controls.
After the server establishes the interactive connection channel between the first user terminal and the anchor terminal, a second interactive instruction can be sent to the first user terminal.
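As an aside before step 802: on the server side, the two ways of establishing the channel in step 801 (initiated by the first user terminal or by the anchor terminal) can share one handler, as in the sketch below. The class, transport layer, and message names are invented for illustration.

```python
class InteractionServer:
    """Minimal sketch of step 801: establishing the interactive connection channel."""

    def __init__(self, transport):
        self.transport = transport  # hypothetical messaging layer
        self.channels = set()       # established (first_user_id, anchor_id) pairs

    def on_interaction_request(self, initiator_id: str, peer_id: str,
                               initiated_by_anchor: bool) -> None:
        # Forward first/second interaction prompt information to the other party.
        prompt = ("second_interaction_prompt" if initiated_by_anchor
                  else "first_interaction_prompt")
        self.transport.send(peer_id, {"type": prompt, "from": initiator_id})

    def on_confirmation(self, first_user_id: str, anchor_id: str,
                        initiated_by_anchor: bool) -> None:
        # A first/second confirmation request came back: establish the channel and
        # feed the corresponding interaction instruction to the first user terminal.
        self.channels.add((first_user_id, anchor_id))
        instruction = ("second_interaction_instruction" if initiated_by_anchor
                       else "first_interaction_instruction")
        self.transport.send(first_user_id, {"type": instruction,
                                            "channel_established": True})
```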
802: and acquiring a first video picture acquired by the anchor terminal.
The anchor terminal collects the first video picture in real time and uploads it to the server terminal, so the execution order of step 802 is not limited to the sequence shown in this embodiment.
803: and acquiring a third video picture acquired by the first user side, and identifying a target object in the third video picture.
After the first user side and the anchor side establish an interactive connection channel, a third video picture can be acquired in real time and sent to the server side.
The server side can also provide the third video picture to the anchor side and the second user side, so that the third video picture can be displayed in a live interface of the anchor side and a live interface of the second user side.
As an alternative, identifying the target object in the third video picture may include:
sending trial prompt information of at least one product to a first user end so that the first user end can output the trial prompt information of the at least one product, and sending a trial request aiming at a target product corresponding to any one of the trial prompt information based on the trigger operation of any one of the trial prompt information;
receiving a trial request sent by a first user terminal;
and identifying a target object corresponding to the target product in the third video picture.
The trial request may include information such as a product identifier of the target product, so as to determine the corresponding target product.
As another alternative, identifying the target object in the third video picture may include:
sending trial prompt information of at least one product to the anchor terminal, so that the anchor terminal outputs the trial prompt information of the at least one product and sends a trial request for the target product corresponding to any piece of the trial prompt information based on a triggering operation on that trial prompt information;
receiving the trial request sent by the anchor terminal;
and identifying a target object corresponding to the target product in the third video picture.
The trial request may include information such as a product identifier of the target product, so as to determine the corresponding target product.
804: and based on the target object, fusing the product effect of the target product associated with the live broadcast room into the third video picture to obtain a second video picture.
Optionally, in step 803, specifically, a feature portion of a target object corresponding to a target product in the third video image may be identified, and in step 804, a product effect graph of the target product associated with the live broadcast may be fused to the third video image based on the feature portion to obtain the second video image.
805: and providing the first video picture and the second video picture to the second user end so as to enable the second user end to display the first video picture and the second video picture.
The second user terminal may provide a first area and a second area in its live interface, so as to display the first video picture in the first area and the second video picture in the second area.
In addition, the server can also provide the first video picture and the second video picture to the first user terminal and the anchor terminal, so that the anchor terminal and the first user terminal can display the first video picture and the second video picture.
The anchor terminal or the first user terminal can provide a first area and a second area in a live interface thereof, so that the first video picture is displayed in the first area and the second video picture is displayed in the second area.
The server terminal may perform mixed-flow processing on the first video picture and the second video picture to synthesize a mixed-flow picture and provide it to the second user terminal; alternatively, the second user terminal may pull the first video picture and the second video picture from the server terminal separately, perform the mixed-flow processing locally, and then display the result.
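The mixed-flow option can be pictured as simple server-side compositing of the two pictures into one frame before distribution. The picture-in-picture placement below mirrors the first-area-over-second-area layout and is only an illustrative assumption; a real implementation would typically mix the encoded streams rather than raw frames.

```python
import numpy as np

def mix_flow(first_frame: np.ndarray, second_frame: np.ndarray,
             scale: float = 0.3, margin: int = 16) -> np.ndarray:
    """Composite the first video picture as a small window over the second one.

    Both inputs are H x W x 3 uint8 frames; the output is one frame of the
    mixed-flow picture that can be pushed to the anchor and user terminals.
    """
    mixed = second_frame.copy()
    h, w = second_frame.shape[:2]
    small_h, small_w = int(h * scale), int(w * scale)

    # Nearest-neighbour downscale of the first video picture (kept dependency-free).
    ys = np.linspace(0, first_frame.shape[0] - 1, small_h).astype(int)
    xs = np.linspace(0, first_frame.shape[1] - 1, small_w).astype(int)
    small = first_frame[ys][:, xs]

    # Overlay in a boundary area (bottom-right corner here).
    y0, x0 = h - small_h - margin, w - small_w - margin
    mixed[y0:y0 + small_h, x0:x0 + small_w] = small
    return mixed
```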
It should be noted that the live interface of the anchor terminal, the first user terminal, or the second user terminal may be generated and provided by the server terminal, and of course, the anchor terminal, the first user terminal, or the second user terminal may locally generate respective corresponding live interfaces in combination with the display content provided by the server terminal.
In some embodiments, the method may further comprise:
providing an operation control corresponding to the target product for the second user end to display the operation control of the target product;
and receiving a processing request which is sent by a second user side and is triggered based on the operation control, and carrying out corresponding processing on the target product.
The specific implementation manner of the operation control can be detailed in the foregoing, and is not described herein again.
In addition, the server side can also provide the first user side with an operation control corresponding to the target product, so that the first user side can display the operation control of the target product; and receiving a processing request which is sent by the first user side and is triggered based on the operation control, and carrying out corresponding processing on the target product.
In a practical application, the technical solution in the embodiment of the present application may be applied to a scene of promoting or selling cosmetic products in the form of a live webcast. Therefore, as another embodiment, the present application further provides an information display method, including:
providing a first area in a live interface of a live broadcast room;
displaying a first video picture in the first area; the first video picture is acquired by a main broadcasting terminal;
providing a second area in the live interface; displaying a makeup trial picture in the second area; the makeup trial picture is obtained by identifying a face characteristic part from a third video picture and fusing a makeup effect picture of a target makeup product associated with the live broadcast room into the third video picture based on the face characteristic part; and the third video picture is acquired by the first user terminal.
When the product is a makeup product, the second video picture obtained by fusion is specifically a makeup trial picture, and the face feature part corresponding to the target makeup product can be determined by performing face recognition on the user in the third video picture.
As still another embodiment, the present application further provides an information display method including:
a first user end provides a first area and a microphone connecting control in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation aiming at the microphone connecting control, establishing a microphone connecting interaction channel with a main broadcasting end through a server end, and providing a second area in the live broadcasting interface;
collecting a third video picture and sending the third video picture to the server;
displaying makeup test prompt information corresponding to at least one makeup product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any makeup trying prompt message, sending a makeup trying request to a server based on a target makeup product corresponding to any makeup trying prompt message so that the server can identify a face feature part corresponding to the target makeup product in the third video picture, and fusing a makeup effect image of the target makeup product to the third video picture based on the face feature part to obtain a makeup trying picture;
displaying the makeup trial picture in the second area.
As still another embodiment, the present application further provides an information display method including:
the method comprises the steps that a main broadcast end provides a first area and microphone connecting controls corresponding to different users in a live broadcast interface, and a first video picture is displayed in the first area;
responding to the triggering operation aiming at any one microphone connecting control, establishing a microphone connecting interaction channel corresponding to the first user side of the any microphone connecting control through a server side, and providing a second area in the live broadcast interface;
displaying makeup test prompt information corresponding to at least one makeup product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any makeup test prompt message, sending a makeup test request to a server based on a target makeup product corresponding to any makeup test prompt message so that the server can identify a face feature part corresponding to the target makeup product in a third video picture acquired by a first user side, and fusing a makeup effect image of the target makeup product into the third video picture based on the face feature part to obtain a makeup test picture;
displaying the makeup trial picture in the second area.
As still another embodiment, the present application further provides an information display method including:
the second user side obtains a first video picture and a makeup trial picture provided by the server side; the first video picture is acquired by a main broadcasting terminal; the makeup trial picture is obtained by identifying a face characteristic part from a third video picture and fusing a makeup effect picture of a target makeup product associated with the live broadcast room into the third video picture based on the face characteristic part; the third video picture is acquired by the first user terminal;
providing a first area and a second area in a live broadcast interface;
displaying the first video picture in the first area, and displaying the makeup trial picture in the second area.
As still another embodiment, the present application further provides an information processing method including:
establishing a connecting wheat interaction channel between a first user terminal and a main broadcasting terminal;
acquiring a first video picture acquired by the anchor terminal;
acquiring a third video image acquired by the first user side, and identifying a face characteristic part in the third video image;
based on the face feature part, fusing a makeup effect picture of a target makeup product associated with a live broadcast room into the third video picture to obtain a makeup trial picture;
and providing the first video picture and the makeup trial picture to a second user end so that the second user end can display the first video picture and the makeup trial picture.
That is, in the live broadcast scene of a makeup product, the first interactive control or the second interactive control may specifically refer to a microphone connecting control, the interactive connection channel may specifically refer to a microphone connecting interaction channel, the product effect graph may be a makeup effect graph, and the second video picture may specifically refer to a makeup trial picture fused with the makeup effect graph. The target object may specifically refer to a face feature part; other identical or corresponding steps are described in detail in the foregoing embodiments and are not repeated here.
At present, the anchor usually explains a makeup product at the live broadcast site and, if necessary, tries it on to promote the product to users, so users cannot form an actual impression of the product on themselves. With the technical solution in the embodiment of the application, a user watching the live broadcast can request a microphone connecting interaction with the anchor, and the anchor can likewise request a microphone connecting interaction with a certain user. After mutual agreement, the anchor terminal establishes a microphone connecting interaction channel with the first user terminal of the microphone connecting user, and the first user terminal captures a third video picture of the scene where that user is located, so that the first video picture acquired by the anchor terminal and the third video picture acquired by the first user terminal can be displayed in the live interface at the same time, and the anchor, the microphone connecting user, and the watching users who are not connected can all view the first video picture and the third video picture in the live interface. To promote products effectively, the microphone connecting user or the anchor can select a target makeup product; assuming the target makeup product is a lipstick of color number A, face recognition can be performed on the third video picture to identify the lip part in the face area, and the lipstick effect graph corresponding to color number A can then be mapped onto the lip part in the third video picture to obtain a makeup trial picture, which presents the face with color number A applied and thereby shows a virtual makeup trial effect. When this second video picture is displayed in the live interface, the anchor, the microphone connecting user, and the watching users can check the virtual trial effect of the lipstick of color number A through the makeup trial picture and experience something close to an actual trial, so that the interacting user and the watching users can understand the target product better, which can stimulate purchases of the target product, improve the product conversion rate, and improve the promotion effect.
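The lipstick example can be prototyped with a standard landmark-plus-mask approach: obtain lip-contour landmarks from any face landmark detector (not shown here and assumed available), fill the lip polygon as a mask, and blend color number A into that region. The concrete BGR value used for "color number A" and the blend strength below are illustrative assumptions.

```python
import numpy as np
import cv2

def apply_lipstick(frame_bgr: np.ndarray, lip_points: np.ndarray,
                   color_a_bgr=(60, 30, 200), strength: float = 0.5) -> np.ndarray:
    """Map a lipstick effect of color number A onto the lip part of a frame.

    frame_bgr  : H x W x 3 uint8 frame from the first user terminal
    lip_points : N x 2 array of lip-contour landmarks (from any face landmark model)
    color_a_bgr: BGR value standing in for "color number A" (assumption)
    strength   : 0..1 blend factor controlling how strong the tint looks
    """
    # Build a binary mask covering the lip polygon.
    mask = np.zeros(frame_bgr.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [lip_points.astype(np.int32)], 255)

    # Blend the lipstick color with the original pixels, then apply only inside the mask.
    tint = np.full_like(frame_bgr, color_a_bgr, dtype=np.uint8)
    blended = cv2.addWeighted(frame_bgr, 1.0 - strength, tint, strength, 0.0)

    makeup_trial = frame_bgr.copy()
    makeup_trial[mask == 255] = blended[mask == 255]
    return makeup_trial
```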
Certainly, this application is not limited to the promotion of cosmetic products; other products such as clothes, shoes and hats, accessories, electronic equipment, or other tangible products can all be promoted with the technical scheme of the embodiment of the application. The target object is also not limited to the human body: as long as an object to which the target product can be applied is within the acquisition range of the first user terminal, the technical scheme in the embodiment of the application can be used to perform a virtual trial of the product, give the user an actual experience of it, and help improve the product promotion effect, the product conversion rate, and the like.
As can be seen from the embodiment shown in fig. 1, the present application further provides a live webcast system, where the live webcast system may include a server, an anchor, a first user, and a second user;
the server is used for establishing an interactive connection channel between the anchor terminal and the first user terminal; acquiring a first video picture acquired by a main broadcasting end; acquiring a third video picture acquired by the first user side, and identifying a target object in the third video picture; based on the target object, fusing the product effect of the target product associated with the live broadcast room into a third video picture to obtain a second video picture; optionally, the server may specifically identify a feature of a target object in the third video frame, and based on the feature, fuse a product effect graph of a target product associated with the live broadcast room into the third video frame to obtain the second video frame;
the first user side is used for acquiring a third video picture after an interactive connection channel with the anchor side is established, and sending the third video picture to the server side; acquiring a first video picture and a second video picture from a server, and displaying the first video picture and the second video picture;
and the second user side is used for acquiring the first video picture and the second video picture from the server side and displaying the first video picture and the second video picture.
Optionally, the anchor terminal may also acquire the second video picture from the server terminal, and display the first video picture and the second video picture.
In yet another embodiment, after the first user side acquires the third video picture, a product effect graph of a target product associated with the live broadcast room is acquired from the server side, and a target object characteristic part in the third video picture is identified; fusing the product effect graph into a third video picture to obtain a second video picture based on the characteristic part; and providing the second video picture to the server. The first user terminal obtains the first video picture from the server terminal, and can simultaneously display the first video picture and the second video picture.
The operations that the server, the first user, the second user, and the anchor may perform have been described in detail in the foregoing embodiments, and will not be described repeatedly herein.
Fig. 9 is a schematic structural diagram of an embodiment of an information display apparatus provided in the present application, where the apparatus may be configured in an electronic device, such as a mobile phone, a tablet computer, a notebook computer, and the like, to implement corresponding functional operations, and the apparatus may include:
a first display module 901, configured to provide a first area in a live interface of a live broadcast room; displaying a first video picture in a first area; the first video picture is acquired by a main broadcasting terminal;
a second display module 902, configured to provide a second area in the live interface; displaying a second video picture in a second area; the second video picture is obtained by identifying a target object from the third video picture and fusing the product effect of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal.
In some embodiments, the second display module is further configured to display an operation control corresponding to the target product in the live interface; the apparatus may further include:
and the first processing module is used for responding to the triggering operation aiming at the operation control and correspondingly processing the target product.
In some embodiments, the operational controls include processing controls corresponding to at least one processing type;
the first processing module may be specifically configured to, in response to a trigger operation for a processing control of any processing type, perform corresponding processing on the target product according to the processing type.
In some embodiments, the processing controls may include a buy-in control.
The first processing module may be specifically configured to add the target product to the shopping cart in response to a triggering operation for the shopping control.
In some embodiments, the processing control comprises a transaction control;
the first processing module may be specifically configured to generate an order placement request based on the target product in response to a triggering operation for the transaction control.
In some embodiments, the first processing module may be specifically configured to, in response to a trigger operation for the operation control, display the target product into a corresponding product description page.
In some embodiments, the first display module is further configured to provide a first interactive control in the live interface; responding to the triggering operation aiming at the first interactive control, and establishing an interactive connection channel with the anchor terminal through the server terminal;
the apparatus may further include:
and the first acquisition module is used for acquiring the third video picture and sending the third video picture to the server.
In some embodiments, the establishing, by the first display module and in response to the triggering operation of the first user for the first interactive control, an interactive connection channel with the anchor terminal through the server terminal may include: and responding to the triggering operation of the first user for the first interaction control, sending a first interaction request to the server side so that the server side sends first interaction prompt information to the anchor side, and establishing an interaction connection channel between the first user side and the anchor side based on a first confirmation request fed back by the anchor side.
In some embodiments, the second display module may be further configured to display a third video picture in the second area before displaying the second video picture in the second area.
In some embodiments, the second display module is further configured to display trial prompt information corresponding to each of at least one product associated with the live broadcast room in the live broadcast interface; and responding to the triggering operation aiming at any trial prompt information, sending a trial request to the server based on the target product corresponding to any trial prompt information so that the server can identify the target object characteristic part corresponding to the target product in the third video picture, and fusing the product effect graph of the target product to the third video picture based on the characteristic part to obtain the second video picture.
In some embodiments, the second display module is further configured to display trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; responding to the trigger operation aiming at any trial prompt information, and acquiring a product effect graph of a target product from the server based on the target product corresponding to any trial prompt information; and fusing the product effect picture into a third video picture to obtain a second video picture.
In addition, the second display module is further configured to send the second video image to the server, so that the server provides the second video image to a second user meeting a predetermined requirement.
In some embodiments, the first display module is further configured to display a second interactive prompt message in the live interface; the second interaction prompt message is sent by the server side after receiving a second interaction request of the anchor side; responding to the confirmation operation aiming at the second interaction prompt message, and sending a second confirmation request to the server;
the apparatus may further include:
and the second acquisition module is used for acquiring a third video picture, sending the third video picture to the server, identifying a target object characteristic part corresponding to the target product from the third video picture by the server based on a trial request of the anchor terminal for the target product associated with the live broadcast room, and fusing a product effect graph of the target product to the third video picture based on the characteristic part to obtain a second video picture.
In some embodiments, the first display module is further configured to display a second interactive control corresponding to a different user in the live interface; and responding to the trigger operation of the anchor aiming at any second interactive control, and establishing an interactive connection channel of the first user end corresponding to any second interactive control through the server end.
In some embodiments, the establishing, by the first display module and in response to a trigger operation of the anchor for any one of the second interactive controls, an interactive connection channel of the first user side corresponding to any one of the second interactive controls through the server side includes: and responding to the trigger operation of the anchor to any second interactive control, sending a second interactive request to the server so that the server sends second interactive prompt information to the first user side corresponding to any second interactive control, and establishing an interactive connection channel between the first user side and the anchor side based on a second confirmation request fed back by the first user side.
In some embodiments, the second display module is further configured to display trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; and responding to the triggering operation aiming at any trial prompt information, sending a trial request to the server based on the target product corresponding to any trial prompt information so that the server can identify the characteristic part of the target object in the third video picture, and fusing the product effect graph of the target product to the third video picture based on the characteristic part to obtain the second video picture.
In some embodiments, the first display module is further configured to display first interaction prompt information in the live interface, where the first interaction prompt information is sent by the server side when the server side receives a first interaction request from the first user side; and, in response to a confirmation operation for the first interaction prompt information, send a first confirmation request to the server side, so that the server side establishes an interactive connection channel between the anchor side and the first user side and instructs the first user side to collect a third video picture.
Optionally, when providing the second area in the live interface, the second display module may provide the second area in the live interface and display the first area in the second area in an overlapping manner.
The information display apparatus shown in fig. 9 may execute the information display method of the embodiment shown in fig. 2; its implementation principle and technical effects are similar and are not repeated here. The specific manner in which each module and unit of the information display apparatus performs its operations has been described in detail in the method embodiments and is not described in detail here.
Fig. 10 is a schematic structural diagram of an embodiment of an information display apparatus provided in the present application, where the apparatus may be configured in an electronic device, such as a mobile phone, a tablet computer, a notebook computer, and the like, to implement corresponding functional operations, and the apparatus may include:
a third display module 1001, configured to provide a first area and a first interactive control in a live interface, and display a first video picture in the first area;
a first response module 1002, configured to, in response to a trigger operation for the first interactive control, establish an interactive connection channel with the anchor terminal through the server, and provide a second area in the live interface;
a first acquisition module 1003, configured to acquire a third video picture and send the third video picture to the server;
a fourth display module 1004, configured to display trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; in response to a trigger operation for any trial prompt information, send a trial request to the server based on the target product corresponding to that trial prompt information, so that the server identifies the target object corresponding to the target product in the third video picture and fuses the product effect of the target product into the third video picture based on the target object to obtain a second video picture; and display the second video picture in the second area (a simplified sketch of this client-side flow follows).
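The modules above amount to a simple request/response flow on the first user end. The sketch below illustrates it under an assumed REST-style server API; the base URL, endpoint paths, and payload fields are invented for illustration and are not part of the disclosure.

```python
# Rough sketch of the first-user-end flow in Fig. 10 under a hypothetical API.
import requests

SERVER = "https://live.example.com"  # hypothetical server address


def on_first_interactive_control(room_id: str, user_id: str) -> str:
    """Triggered when the viewer taps the interactive (co-streaming) control."""
    resp = requests.post(f"{SERVER}/rooms/{room_id}/interactions",
                         json={"user_id": user_id}, timeout=5)
    resp.raise_for_status()
    return resp.json()["channel_id"]  # id of the interactive connection channel


def upload_third_video_frame(channel_id: str, frame_bytes: bytes) -> None:
    """Push a locally captured frame (the third video picture) to the server."""
    requests.post(f"{SERVER}/channels/{channel_id}/frames",
                  data=frame_bytes, timeout=5)


def on_trial_prompt(channel_id: str, product_id: str) -> bytes:
    """Send a trial request; the server returns the fused second video picture."""
    resp = requests.post(f"{SERVER}/channels/{channel_id}/trials",
                         json={"product_id": product_id}, timeout=5)
    resp.raise_for_status()
    return resp.content  # rendered by the client in the second area
```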
The information display apparatus shown in fig. 10 may execute the information display method of the embodiment shown in fig. 4; its implementation principle and technical effects are similar and are not repeated here. The specific manner in which each module and unit of the information display apparatus performs its operations has been described in detail in the method embodiments and is not described in detail here.
Fig. 11 is a schematic structural diagram illustrating a further embodiment of an information display apparatus provided in the present application, where the apparatus may be configured in an electronic device, such as a mobile phone, a tablet computer, a notebook computer, etc., to implement corresponding functional operations, and the apparatus may include:
a fifth display module 1101, configured to provide a first area and second interactive controls corresponding to different users in a live interface, and display a first video picture in the first area;
a second response module 1102, configured to, in response to a trigger operation for any second interactive control, establish, through the server, an interactive connection channel with the first user side corresponding to that second interactive control, and provide a second area in the live interface;
a sixth display module 1103, configured to display trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; in response to a trigger operation for any trial prompt information, send a trial request to the server based on the target product corresponding to that trial prompt information, so that the server identifies the target object corresponding to the target product in the third video picture collected by the first user side and fuses the product effect of the target product into the third video picture based on the target object to obtain a second video picture; and display the second video picture in the second area (a brief sketch of this anchor-side trigger follows).
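For the anchor-side trigger, a rough sketch is shown below; the endpoint and payload names are again assumptions, the point being only that the anchor identifies the connected first user and the product to try.

```python
# Sketch of the anchor-end trigger in Fig. 11: the anchor picks a connected
# viewer and a product, and asks the server to apply the try-on to that
# viewer's stream. Endpoint and payload names are hypothetical.
import requests

SERVER = "https://live.example.com"  # hypothetical server address


def anchor_request_trial(room_id: str, first_user_id: str, product_id: str) -> None:
    """Ask the server to fuse the product effect into the frames captured by
    the selected first user and return the result as the second video picture."""
    resp = requests.post(
        f"{SERVER}/rooms/{room_id}/trials",
        json={"target_user": first_user_id, "product_id": product_id},
        timeout=5,
    )
    resp.raise_for_status()
```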
The information display apparatus shown in fig. 11 may execute the information display method of the embodiment shown in fig. 6; its implementation principle and technical effects are similar and are not repeated here. The specific manner in which each module and unit of the information display apparatus performs its operations has been described in detail in the method embodiments and is not described in detail here.
Fig. 12 is a schematic structural diagram illustrating a further embodiment of an information display apparatus provided in the present application, where the apparatus may be configured in an electronic device, such as a mobile phone, a tablet computer, a notebook computer, etc., to implement corresponding functional operations, and the apparatus may include:
a first obtaining module 1201, configured to obtain a first video picture and a second video picture provided by the server; the first video picture is acquired by the anchor terminal; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect image of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal;
a seventh display module 1202, configured to provide the first area and the second area in the live interface; a first video picture is displayed in the first area, and a second video picture is displayed in the second area.
The information display apparatus shown in fig. 12 may execute the information display method of the embodiment shown in fig. 7; its implementation principle and technical effects are similar and are not repeated here. The specific manner in which each module and unit of the information display apparatus performs its operations has been described in detail in the method embodiments and is not described in detail here.
Fig. 13 shows a schematic structural diagram of an embodiment of an electronic device provided in the present application, where the electronic device may include a storage component 1301, a display component 1302, and a processing component 1303; storage component 1301 stores one or more computer program instructions; the one or more computer program instructions are called and executed by the processing component 1303, so as to implement the information display method according to the embodiment shown in fig. 2.
The processing component 1303 may include one or more processors executing computer instructions to perform all or part of the steps of the above method. Of course, the processing elements may also be implemented as one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components configured to perform the above-described methods.
The storage component 1301 is configured to store various types of data to support operations in the electronic device. The memory components may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The display component 1302 may be an electroluminescent (EL) element, a liquid crystal display or a micro-display with a similar structure, or a laser-scanning type display that projects directly onto the retina, or the like.
Of course, the electronic device may also include other components, such as an input/output interface, a communication component, and the like.
In practical applications, the electronic device may be an intelligent terminal such as a mobile phone, a tablet computer, a computer, or a smart watch.
The electronic device may be configured as the anchor terminal, the first user terminal, or the second user terminal. When configured as the first user terminal, the electronic device may specifically execute the information display method shown in fig. 4; when configured as the anchor terminal, it may specifically execute the information display method shown in fig. 6; and when configured as the second user terminal, it may specifically execute the information display method shown in fig. 7.
In addition, an embodiment of the present application further provides a live interface, which may include a first area and a second area;
the first area is used for displaying a first video picture; the first video picture is acquired by the anchor terminal;
the second area is used for displaying a second video picture; the second video picture is obtained by identifying a target object from a third video picture and fusing the product effect of a target product associated with the live broadcast room into the third video picture based on the position of the target object; and the third video picture is acquired by the first user terminal.
Possible implementations of the live interface can be seen in fig. 3a to 3c and fig. 5a to 5f, but the application is not limited thereto.
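As an illustration only, the two-area live interface could be modeled on a client as a pair of rectangles with an optional picture-in-picture overlay; the class names, coordinates, and sizes below are arbitrary assumptions and not taken from the figures.

```python
# One way to model the live interface regions: two rectangular areas, with the
# first area optionally rendered as an overlay inside the second area, as in
# the "overlapping" display option described above.
from dataclasses import dataclass


@dataclass
class Area:
    x: int
    y: int
    width: int
    height: int


@dataclass
class LiveInterfaceLayout:
    first_area: Area       # shows the first video picture (anchor stream)
    second_area: Area      # shows the second video picture (fused try-on stream)
    overlay: bool = False  # True when the first area is drawn inside the second


def default_layout(screen_w: int, screen_h: int) -> LiveInterfaceLayout:
    """Full-screen try-on view with the anchor stream as a small corner overlay."""
    second = Area(0, 0, screen_w, screen_h)
    first = Area(screen_w - screen_w // 3 - 16, 16, screen_w // 3, screen_h // 4)
    return LiveInterfaceLayout(first_area=first, second_area=second, overlay=True)
```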
In addition, an embodiment of the present application further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a computer, the information display method of the embodiment shown in fig. 2, fig. 4, fig. 6, or fig. 7 may be implemented.
Fig. 14 is a schematic structural diagram of an embodiment of an information processing apparatus provided in the present application, where the apparatus may include:
an interactive triggering module 1401, configured to establish an interactive connection channel between a first user side and an anchor terminal;
a second obtaining module 1402, configured to obtain a first video picture collected by the anchor terminal, and obtain a third video picture collected by the first user side;
an identifying module 1403, configured to identify a target object in the third video picture;
a fusion module 1404, configured to fuse, based on the target object, a product effect of a target product associated with the live broadcast room into the third video picture to obtain a second video picture;
a providing module 1405, configured to provide the first video picture and the second video picture to the second user end, so that the second user end can display the first video picture and the second video picture.
In some embodiments, the interaction triggering module may be specifically configured to receive a first interaction request sent by the first user side and send first interaction prompt information to the anchor terminal, so that the anchor terminal outputs the first interaction prompt information and generates a first confirmation request based on a confirmation operation for the first interaction prompt information; and to receive the first confirmation request sent by the anchor terminal and establish an interactive connection channel between the first user terminal and the anchor terminal.
In some embodiments, the interaction triggering module may be specifically configured to receive a second interaction request sent by the anchor terminal, send second interaction prompt information to the corresponding first user terminal, so that the first user terminal outputs the second interaction prompt information, and generate a second confirmation request based on a confirmation operation for the second interaction prompt information; and receiving a second confirmation request sent by the first user terminal, and establishing an interactive connection channel between the first user terminal and the anchor terminal.
In some embodiments, the identification module is specifically configured to send trial prompt information of at least one product to the first user end, so that the first user end outputs the trial prompt information of the at least one product and, based on a trigger operation on any trial prompt information, sends a trial request for the target product corresponding to that trial prompt information; receive the trial request sent by the first user end; and identify the characteristic part of the target object corresponding to the target product in the third video picture.
In some embodiments, the identification module is specifically configured to send trial prompt information of at least one product to the anchor terminal, so that the anchor terminal outputs the trial prompt information of the at least one product and, based on a trigger operation on any trial prompt information, sends a trial request for the target product corresponding to that trial prompt information; receive the trial request sent by the anchor terminal; and identify the characteristic part of the target object corresponding to the target product in the third video picture.
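Taken together, these modules describe an identify-then-fuse pipeline on the server side. The sketch below is one way such a pipeline could look, using OpenCV's bundled Haar face detector purely as a stand-in for whatever recognition method the embodiments contemplate; it is an illustration of the data flow, not the patented implementation.

```python
# Minimal server-side identify-then-fuse sketch (assumptions: RGBA effect
# image, face region as the "characteristic part" of the target object).
import cv2
import numpy as np

FACE_DETECTOR = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def identify_target_object(third_frame_bgr: np.ndarray):
    """Return (x, y, w, h) of the first detected face, or None if none found."""
    gray = cv2.cvtColor(third_frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = FACE_DETECTOR.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return tuple(faces[0]) if len(faces) else None


def fuse_product_effect(third_frame_bgr: np.ndarray,
                        effect_rgba: np.ndarray,
                        face_box) -> np.ndarray:
    """Alpha-blend the product effect image over the detected face region."""
    x, y, w, h = [int(v) for v in face_box]
    effect = cv2.resize(effect_rgba, (w, h))
    alpha = effect[:, :, 3:4].astype(np.float32) / 255.0
    roi = third_frame_bgr[y:y + h, x:x + w].astype(np.float32)
    fused = alpha * effect[:, :, :3].astype(np.float32) + (1.0 - alpha) * roi
    out = third_frame_bgr.copy()
    out[y:y + h, x:x + w] = fused.astype(np.uint8)
    return out  # the "second video picture"


def process_trial(third_frame_bgr: np.ndarray, effect_rgba: np.ndarray) -> np.ndarray:
    """Identify the target object, then fuse; fall back to the original frame."""
    box = identify_target_object(third_frame_bgr)
    return fuse_product_effect(third_frame_bgr, effect_rgba, box) if box else third_frame_bgr
```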
In some embodiments, the providing module is further configured to provide the first video picture and the second video picture to the first user terminal and the anchor terminal, so that the anchor terminal and the first user terminal can display the first video picture and the second video picture.
In some embodiments, the providing module is further configured to provide an operation control corresponding to the target product to the second user end, so that the second user end displays the operation control of the target product; and to receive a processing request that is sent by the second user end and triggered based on the operation control, and perform corresponding processing on the target product.
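The operation-control handling could be dispatched on the server roughly as follows; the request fields and the in-memory cart and order stand-ins are assumptions made only to show the add-to-cart, order, and detail-page branches described above.

```python
# Hypothetical dispatch of a control-triggered processing request from the
# second user end; field names and actions are illustrative assumptions.
from typing import Any


def handle_processing_request(request: dict[str, Any],
                              cart: list[str],
                              orders: list[dict[str, Any]]) -> dict[str, Any]:
    """Route a processing request to the corresponding handling branch."""
    product_id = request["product_id"]
    action = request["action"]
    if action == "add_to_cart":        # add-to-cart control
        cart.append(product_id)
        return {"status": "ok", "cart_size": len(cart)}
    if action == "place_order":        # transaction control
        order = {"product_id": product_id, "user_id": request["user_id"]}
        orders.append(order)
        return {"status": "ok", "order": order}
    if action == "view_detail":        # jump to the product description page
        return {"status": "ok", "detail_url": f"/products/{product_id}"}
    return {"status": "error", "reason": "unknown action"}
```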
The information processing apparatus shown in fig. 14 can execute the information processing method of the embodiment shown in fig. 8; its implementation principle and technical effects are similar and are not described again. The specific manner in which each module and unit of the information processing apparatus performs its operations has been described in detail in the method embodiments and is not repeated here.
In one possible design, the information processing apparatus in the embodiment shown in fig. 14 may be implemented as a computing device, which may act as the server side in the embodiment shown in fig. 1. As shown in fig. 15, the computing device may include a storage component 1501 and a processing component 1502;
the storage component 1501 stores one or more computer instructions for the processing component 1502 to invoke for execution to implement the information processing method shown in fig. 8.
Among other things, the processing component 1502 may include one or more processors executing computer instructions to perform all or some of the steps of the methods described above. Of course, the processing elements may also be implemented as one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components configured to perform the above-described methods.
The storage component 1501 is configured to store various types of data to support operations on a computing device. The memory components may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Of course, the computing device may also include other components, such as an input/output interface, a communication component, and the like.
The input/output interface provides an interface between the processing components and peripheral interface modules, which may be output devices, input devices, etc.
The communication component is configured to facilitate wired or wireless communication between the computing device and other devices, and the like.
In actual applications, the computing device may be a physical device or an elastic computing host provided by a cloud computing platform. When the computing device is a cloud server, the processing component, the storage component, and the like may be basic server resources rented or purchased from the cloud computing platform.
In addition, an embodiment of the present application further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a computer, the computer program can implement the information processing method according to the embodiment shown in fig. 8.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.
Claims (43)
1. An information display method, comprising:
providing a first area in a live interface of a live broadcast room;
displaying a first video picture in the first area; the first video picture is acquired by a main broadcasting terminal;
providing a second area in the live interface; displaying a second video picture in the second area; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal.
2. The method of claim 1, further comprising:
displaying an operation control corresponding to the target product in the live broadcast interface;
and responding to the trigger operation aiming at the operation control, and correspondingly processing the target product.
3. The method of claim 2, wherein the operation control comprises a processing control corresponding to at least one processing type;
the responding to the triggering operation aiming at the operation control, and the corresponding processing of the target product comprises the following steps:
and responding to the triggering operation of the processing control aiming at any processing type, and performing corresponding processing on the target product according to the processing type.
4. The method of claim 3, wherein the processing control comprises an add-to-cart control;
the responding to the triggering operation of the processing control aiming at any processing type, and the corresponding processing of the target product according to the processing type comprises the following steps:
and responding to the triggering operation of the add-to-cart control, and adding the target product to a shopping cart.
5. The method of claim 3, wherein the processing control comprises a transaction control;
the responding to the triggering operation of the processing control aiming at any processing type, and the corresponding processing of the target product according to the processing type comprises the following steps:
and generating an ordering request based on the target product in response to a triggering operation for the transaction control.
6. The method of claim 2, wherein the performing the corresponding processing on the target product in response to the triggering operation for the operation control comprises:
and responding to the triggering operation aiming at the operation control, and displaying a product description page corresponding to the target product.
7. The method of claim 1, further comprising:
providing a first interaction control in the live interface;
responding to the triggering operation aiming at the first interactive control, and establishing an interactive connection channel with a main broadcasting end through a server end;
and acquiring a third video picture, and sending the third video picture to the server.
8. The method of claim 7, wherein the establishing, by the server, an interactive connection channel with the anchor in response to the triggering operation of the first user on the first interactive control comprises:
and responding to the triggering operation of the first user for the first interaction control, sending a first interaction request to a server side for the server side to send first interaction prompt information to an anchor side, and establishing an interaction connection channel between the first user side and the anchor side based on a first confirmation request fed back by the anchor side.
9. The method of claim 7, wherein prior to displaying the second video picture in the second region, the method further comprises:
displaying the third video picture in the second region.
10. The method of claim 7, wherein prior to displaying the second video picture in the second region, the method further comprises:
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to any trial prompt information so that the server can identify a target object characteristic part corresponding to the target product in the third video picture, and fusing a product effect graph of the target product to the third video picture based on the characteristic part to obtain the second video picture.
11. The method of claim 7, wherein prior to displaying the second video picture in the second region, the method further comprises:
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface;
responding to the trigger operation aiming at any trial prompt information, and acquiring a product effect graph of a target product from a server based on the target product corresponding to any trial prompt information;
and fusing the product effect graph into the third video picture to obtain the second video picture.
12. The method of claim 11, further comprising:
and sending the second video picture to a server side so that the server side can provide the second video picture to a second user side meeting the preset requirement.
13. The method of claim 1, further comprising:
displaying second interaction prompt information in the live broadcast interface; the second interaction prompt message is sent by the server side after receiving a second interaction request of the anchor side;
responding to the confirmation operation aiming at the second interaction prompt message, and sending a second confirmation request to the server;
acquiring a third video picture, sending the third video picture to the server, so that the server can identify a target object characteristic part corresponding to a target product from the third video picture based on a trial request which is sent by the anchor terminal and aims at the target product associated with the live broadcast room, and fusing a product effect graph of the target product to the third video picture based on the characteristic part to obtain a second video picture.
14. The method of claim 1, wherein prior to providing the second region in the live interface, the method further comprises:
displaying second interactive controls corresponding to different users in the live broadcast interface;
and responding to the trigger operation of the anchor aiming at any second interactive control, and establishing an interactive connection channel of the first user end corresponding to any second interactive control through the server end.
15. The method of claim 14, wherein the establishing, by the server, an interactive connection channel of the first user side corresponding to any one of the second interactive controls in response to a trigger operation of an anchor for the any one of the second interactive controls comprises:
and responding to the trigger operation of the anchor to any second interactive control, sending a second interactive request to a server, so that the server sends second interactive prompt information to a first user end corresponding to any second interactive control, and establishing an interactive connection channel between the first user end and the anchor based on a second confirmation request fed back by the first user end.
16. The method of claim 14, wherein prior to displaying the second video picture in the second region, the method further comprises:
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to any trial prompt information so that the server can identify the characteristic part of the target object in the third video picture, and fusing the product effect graph of the target product to the third video picture based on the characteristic part to obtain the second video picture.
17. The method of claim 1, wherein prior to providing the second region in the live interface, the method further comprises:
displaying first interaction prompt information in the live broadcast interface, wherein the first interaction prompt information is sent by a server side after receiving a first interaction request of a first user side;
and responding to the confirmation operation aiming at the first interaction prompt message, and sending a first confirmation request to the server side so that the server side can establish an interaction connection channel between the anchor side and the first user side, and indicating the first user side to collect a third video picture.
18. The method of claim 1, wherein providing the second region in the live interface comprises:
and providing a second area in the live broadcast interface, and displaying the first area in the second area in an overlapping manner.
19. An information display method, comprising:
a first user end provides a first area and a first interactive control in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation aiming at the first interactive control, establishing an interactive connection channel with a main broadcasting end through a server end, and providing a second area in the live broadcasting interface;
collecting a third video picture and sending the third video picture to the server;
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in the third video picture, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture;
displaying the second video picture in the second area.
20. An information display method, comprising:
the anchor terminal provides a first area and second interactive controls corresponding to different users in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation aiming at any second interactive control, establishing an interactive connection channel corresponding to any second interactive control and a first user side through a server side, and providing a second area in the live broadcast interface;
displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in a third video picture acquired by a first user side, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture;
displaying the second video picture in the second area.
21. An information display method, comprising:
the second user side obtains a first video picture and a second video picture provided by the server side; the first video picture is acquired by a main broadcasting terminal; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; the third video picture is acquired by the first user terminal;
providing a first area and a second area in a live broadcast interface;
displaying the first video picture in the first area and displaying the second video picture in the second area.
22. An information processing method characterized by comprising:
establishing an interactive connection channel between a first user terminal and an anchor terminal;
acquiring a first video picture acquired by the anchor terminal;
acquiring a third video picture acquired by the first user side, and identifying a target object in the third video picture;
based on the target object, fusing a product effect of a target product associated with the live broadcast room into the third video picture to obtain a second video picture;
and providing the first video picture and the second video picture to a second user end so as to enable the second user end to display the first video picture and the second video picture.
23. The method of claim 22, wherein establishing the interactive connection channel between the first user terminal and the anchor terminal comprises:
receiving a first interaction request sent by a first user end, sending first interaction prompt information to a main broadcast end so that the main broadcast end can output the first interaction prompt information, and generating a first confirmation request based on confirmation operation aiming at the first interaction prompt information;
and receiving the first confirmation request sent by the anchor terminal, and establishing an interactive connection channel between the first user terminal and the anchor terminal.
24. The method of claim 22, wherein establishing the interactive connection channel between the first user terminal and the anchor terminal comprises:
receiving a second interaction request sent by a main broadcasting end, sending second interaction prompt information to a corresponding first user end so that the first user end can output the second interaction prompt information, and generating a second confirmation request based on confirmation operation aiming at the second interaction prompt information;
and receiving the second confirmation request sent by the first user terminal, and establishing an interactive connection channel between the first user terminal and the anchor terminal.
25. The method of claim 22, wherein the identifying the target object in the third video picture comprises:
sending trial prompt information of at least one product to the first user side so that the first user side can output the trial prompt information of the at least one product, and sending a trial request aiming at a target product corresponding to any one of the trial prompt information based on the triggering operation of any one of the trial prompt information;
receiving the trial request sent by the first user terminal;
and identifying a target object corresponding to the target product in the third video picture.
26. The method of claim 22, wherein the identifying the target object in the third video picture comprises:
sending trial prompt information of at least one product to the anchor terminal so that the anchor terminal can output the trial prompt information of the at least one product, and sending a trial request aiming at a target product corresponding to any one of the trial prompt information based on the trigger operation of any one of the trial prompt information;
receiving the trial request sent by the anchor terminal;
and identifying a target object corresponding to the target product in the third video picture.
27. The method of claim 22, further comprising:
and providing the first video picture and the second video picture to the first user terminal and the anchor terminal so that the anchor terminal and the first user terminal can display the first video picture and the second video picture.
28. The method of claim 22, further comprising:
providing the operation control corresponding to the target product to the second user end so that the second user end can display the operation control of the target product;
and receiving a processing request which is sent by the second user side and triggered based on the operation control, and carrying out corresponding processing on the target product.
29. An information display method, comprising:
a first user end provides a first area and a microphone connecting control in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation aiming at the microphone connecting control, establishing a microphone connecting interaction channel with a main broadcasting end through a server end, and providing a second area in the live broadcasting interface;
collecting a third video picture and sending the third video picture to the server;
displaying makeup test prompt information corresponding to at least one makeup product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any makeup test prompt information, sending a makeup test request to a server based on a target makeup product corresponding to the makeup test prompt information, so that the server can identify a face feature part corresponding to the target makeup product in the third video picture, and fusing a makeup effect image of the target makeup product into the third video picture based on the face feature part to obtain a makeup trial picture;
displaying the makeup trial picture in the second area.
30. An information display method, comprising:
an anchor terminal provides a first area and microphone connecting controls corresponding to different users in a live broadcast interface, and displays a first video picture in the first area;
responding to the triggering operation aiming at any one microphone connecting control, establishing a microphone connecting interaction channel corresponding to the first user side of the any microphone connecting control through a server side, and providing a second area in the live broadcast interface;
displaying makeup test prompt information corresponding to at least one makeup product associated with the live broadcast room in the live broadcast interface;
responding to a trigger operation aiming at any makeup test prompt information, sending a makeup test request to a server based on a target makeup product corresponding to the makeup test prompt information, so that the server can identify a face feature part corresponding to the target makeup product in a third video picture acquired by a first user side, and fusing a makeup effect image of the target makeup product into the third video picture based on the face feature part to obtain a makeup trial picture;
displaying the makeup trial picture in the second area.
31. An information display method, comprising:
the second user side obtains a first video picture and a makeup trial picture provided by the server side; the first video picture is acquired by a main broadcasting terminal; the makeup trial picture is obtained by identifying a face characteristic part from a third video picture and fusing a makeup effect picture of a target makeup product associated with the live broadcast room into the third video picture based on the face characteristic part; the third video picture is acquired by the first user terminal;
providing a first area and a second area in a live broadcast interface;
displaying the first video picture in the first area, and displaying the makeup trial picture in the second area.
32. An information processing method characterized by comprising:
establishing a microphone connecting interaction channel between a first user terminal and an anchor terminal;
acquiring a first video picture acquired by the anchor terminal;
acquiring a third video image acquired by the first user side, and identifying a face characteristic part in the third video image;
based on the face feature part, fusing a makeup effect picture of a target makeup product associated with a live broadcast room into the third video picture to obtain a makeup trial picture;
and providing the first video picture and the makeup trial picture to a second user end so that the second user end can display the first video picture and the makeup trial picture.
33. A network live broadcast system is characterized by comprising a server, a main broadcast end, a first user end and a second user end;
the server is used for establishing an interactive connection channel between the anchor terminal and the first user terminal; acquiring a first video picture acquired by the anchor terminal; acquiring a third video picture acquired by the first user side, and identifying a target object in the third video picture; based on the target object, fusing the product effect of the target product associated with the live broadcast room into the third video picture to obtain a second video picture;
the first user side is used for acquiring a third video picture after an interactive connection channel with the anchor side is established, and sending the third video picture to the server side; acquiring the first video picture and the second video picture from the server side, and displaying the first video picture and the second video picture;
the second user side is used for acquiring the first video picture and the second video picture from the server side and displaying the first video picture and the second video picture.
34. An information display device characterized by comprising:
the first display module is used for providing a first area in a live interface of a live broadcast room; displaying a first video picture in the first area; the first video picture is acquired by a main broadcasting terminal;
the second display module is used for providing a second area in the live broadcast interface; displaying a second video picture in the second area; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; and the third video picture is acquired by the first user terminal.
35. An information display device characterized by comprising:
the third display module is used for providing a first area and a first interactive control in a live broadcast interface and displaying a first video picture in the first area;
the first response module is used for responding to the triggering operation aiming at the first interactive control, establishing an interactive connection channel with a main broadcast end through a server end, and providing a second area in the live broadcast interface;
the first acquisition module is used for acquiring a third video picture and sending the third video picture to the server;
the fourth display module is used for displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in the third video picture, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture; displaying the second video picture in the second area.
36. An information display device characterized by comprising:
the fifth display module is used for providing a first area and second interactive controls corresponding to different users in a live broadcast interface and displaying a first video picture in the first area;
the second response module is used for responding to the triggering operation aiming at any second interactive control, establishing an interactive connection channel corresponding to a first user end of any second interactive control through the server end, and providing a second area in the live broadcast interface;
the sixth display module is used for displaying trial prompt information corresponding to at least one product associated with the live broadcast room in the live broadcast interface; responding to a trigger operation aiming at any trial prompt information, sending a trial request to a server based on a target product corresponding to the any trial prompt information so that the server can identify a target object corresponding to the target product in a third video picture acquired by a first user side, and fusing a product effect of the target product into the third video picture based on the target object to obtain a second video picture; displaying the second video picture in the second area.
37. An information display device characterized by comprising:
the first acquisition module is used for acquiring a first video picture and a second video picture provided by the server; the first video picture is acquired by a main broadcasting terminal; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object; the third video picture is acquired by the first user terminal;
the seventh display module is used for providing a first area and a second area in a live broadcast interface; displaying the first video picture in the first area and displaying the second video picture in the second area.
38. An information processing apparatus characterized by comprising:
the interactive triggering module is used for establishing an interactive connection channel between the first user terminal and the anchor terminal;
the second acquisition module is used for acquiring a first video picture acquired by the anchor terminal; acquiring a third video picture acquired by the first user side;
the identification module is used for identifying a target object in the third video picture;
the fusion module is used for fusing the product effect of the target product associated with the live broadcast room into the third video picture to obtain a second video picture based on the target object;
and the providing module is used for providing the first video picture and the second video picture to a second user end so as to enable the second user end to display the first video picture and the second video picture.
39. A live broadcast interface is characterized by comprising a first area and a second area;
the first area is used for displaying a first video picture; the first video picture is acquired by a main broadcasting terminal;
the second area is used for displaying a second video picture; the second video picture is obtained by identifying a target object from a third video picture and fusing a product effect of a target product associated with the live broadcast room into the third video picture based on the target object position; and the third video picture is acquired by the first user terminal.
40. An electronic device is characterized by comprising a storage component, a display component and a processing component; the storage component stores one or more computer program instructions; the one or more computer program instructions for invocation and execution by the processing component to implement the information display method of any of claims 1-18.
41. A computing device comprising a processing component and a storage component;
the storage component stores one or more computer instructions; the one or more computer instructions for execution by the processing component to perform the information processing method of claim 22.
42. A computer storage medium storing a computer program which, when executed by a computer, implements an information display method according to any one of claims 1 to 18.
43. A computer storage medium characterized by storing a computer program that realizes the information processing method according to claim 22 when executed by a computer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010340430.3A CN113301412B (en) | 2020-04-26 | 2020-04-26 | Information display method, information processing method, device and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113301412A true CN113301412A (en) | 2021-08-24 |
CN113301412B CN113301412B (en) | 2023-04-18 |
Family
ID=77317975
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010340430.3A Active CN113301412B (en) | 2020-04-26 | 2020-04-26 | Information display method, information processing method, device and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113301412B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120054797A1 (en) * | 2010-08-27 | 2012-03-01 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods and apparatus for providing electronic program guides |
CN106682958A (en) * | 2016-11-21 | 2017-05-17 | 汕头市智美科技有限公司 | Method and device for trying on makeup virtually |
CN106791904A (en) * | 2016-12-29 | 2017-05-31 | 广州华多网络科技有限公司 | Live purchase method and device |
CN109874021A (en) * | 2017-12-04 | 2019-06-11 | 腾讯科技(深圳)有限公司 | Living broadcast interactive method, apparatus and system |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023030079A1 (en) * | 2021-09-06 | 2023-03-09 | 北京字跳网络技术有限公司 | Article display method and apparatus, and electronic device and storage medium |
CN115776575A (en) * | 2021-09-06 | 2023-03-10 | 北京字跳网络技术有限公司 | Article display method and device, electronic equipment and storage medium |
CN115776575B (en) * | 2021-09-06 | 2024-07-23 | 北京字跳网络技术有限公司 | Method and device for displaying article, electronic equipment and storage medium |
WO2024099335A1 (en) * | 2022-11-10 | 2024-05-16 | 北京字跳网络技术有限公司 | Livestream method and apparatus, device and storage medium |
CN117544795A (en) * | 2023-11-03 | 2024-02-09 | 书行科技(北京)有限公司 | Live broadcast information display method, management method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN113301412B (en) | 2023-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113301412B (en) | Information display method, information processing method, device and system | |
CN105701217B (en) | Information processing method and server | |
CN106792228B (en) | Live broadcast interaction method and system | |
CN108391153B (en) | Virtual gift display method and device and electronic equipment | |
CN111405343A (en) | Live broadcast interaction method and device, electronic equipment and storage medium | |
CN113965811B (en) | Play control method and device, storage medium and electronic device | |
CN107911737B (en) | Media content display method and device, computing equipment and storage medium | |
CN109525850A (en) | A kind of live broadcasting method, apparatus and system | |
CN105828123A (en) | Method and apparatus for interaction in live broadcast | |
WO2012109666A1 (en) | Contextual commerce for viewers of video programming | |
JP2006005897A (en) | Terminal device, content distribution system, information output method, information output program | |
US9781492B2 (en) | Systems and methods for making video discoverable | |
US11871154B2 (en) | Video distribution server, video distribution method and recording medium | |
US20170131851A1 (en) | Integrated media display and content integration system | |
CN111107434A (en) | Information recommendation method and device | |
CN111479119A (en) | Method, device and system for collecting feedback information in live broadcast and storage medium | |
CN113301421A (en) | Live broadcast clip display method and device, storage medium and electronic equipment | |
CN103957464A (en) | Advertisement distributing method and system | |
CN113784180A (en) | Video display method, video pushing method, video display device, video pushing device, video display equipment and storage medium | |
CN115174953B (en) | Event virtual live broadcast method, system and event live broadcast server | |
WO2017047288A1 (en) | Video display system | |
KR20170066287A (en) | System and method for marketing service using augmented reality | |
JP5852171B2 (en) | Content additional information provision system | |
JP2021170788A (en) | Video distribution server, video distribution method, and video distribution program | |
CN115278329B (en) | Video playing method, system and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |