CN113194323A - Information interaction method, multimedia information interaction method and device


Info

Publication number
CN113194323A
CN113194323A (application CN202110459499.2A)
Authority
CN
China
Prior art keywords
information
analyzed
analysis result
target user
interaction
Prior art date
Legal status
Granted
Application number
CN202110459499.2A
Other languages
Chinese (zh)
Other versions
CN113194323B (en)
Inventor
窦洁
Current Assignee
Koubei Shanghai Information Technology Co Ltd
Original Assignee
Koubei Shanghai Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Koubei Shanghai Information Technology Co Ltd filed Critical Koubei Shanghai Information Technology Co Ltd
Priority to CN202110459499.2A priority Critical patent/CN113194323B/en
Publication of CN113194323A publication Critical patent/CN113194323A/en
Application granted granted Critical
Publication of CN113194323B publication Critical patent/CN113194323B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/435 Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses an information interaction method, comprising: a first end and a second end that at least perform video information interaction; the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end includes an information acquisition window for acquiring information to be analyzed of the target user; after acquiring the information to be analyzed of the target user, the first end sends it to the second end, or the first end analyzes the information to be analyzed to obtain an analysis result and sends the analysis result to the second end. Because the analysis result is obtained by analyzing the information to be analyzed, the result obtained by the first end or the second end is more targeted to the target user; interaction between the live broadcast service provider and the target user is thereby increased, and the accuracy of the analysis result that the live broadcast service provider provides for the information to be analyzed is improved.

Description

Information interaction method, multimedia information interaction method and device
Technical Field
The application relates to live broadcast technology, and in particular to an information interaction method, an information interaction apparatus, a multimedia information interaction method, an electronic device, and a computer storage medium, as well as further information interaction methods and information interaction apparatuses.
Background
Live broadcasting is a new and widely used form of information dissemination. In a live broadcast service, live content is published to a live broadcast platform, and users who access the platform through a dedicated live broadcast client, a browser, or the like can watch the content, forming real-time information transmission.
In the live broadcast clients provided by the prior art, the display area is limited and is mainly used for presenting live content. In medical aesthetics (medical and beauty) live broadcasts in particular, the anchor mostly responds passively to users' questions and has no way of actively interacting with users, which reduces interaction between the user and the anchor. Moreover, the existing interaction between the user and the anchor is limited to voice or text, and the anchor's answers to users' questions are relatively general, so the user cannot obtain accurate medical aesthetics consultation information from the interaction with the anchor.
Therefore, how to make the interaction between the anchor and the user more engaging and how to improve the accuracy of the medical aesthetics service information provided by the anchor have become problems to be solved by those skilled in the art.
Disclosure of Invention
The embodiments of the application provide an information interaction method to address the prior-art problem of how to make the interaction between the anchor and the user more engaging and how to improve the accuracy of the medical aesthetics service information provided by the anchor. The embodiments of the application also provide an information interaction apparatus, further information interaction methods and information interaction apparatuses, a multimedia information interaction method, an electronic device, and a computer storage medium.
An embodiment of the application provides an information interaction method, comprising: a first end and a second end capable of instant message interaction, the first end and the second end at least performing video information interaction; the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end includes an information acquisition window for acquiring information to be analyzed of the target user; after acquiring the information to be analyzed of the target user, the first end sends it to the second end, or the first end analyzes the information to be analyzed to obtain an analysis result and sends the analysis result to the second end.
Optionally, after the first end obtains the information to be analyzed of the target user and sends the information to the second end, the second end analyzes the information to be analyzed to obtain an analysis result, and the second end presents the analysis result to the service provider.
Optionally, the information obtaining window is an image recognition window, and is configured to obtain at least facial information, limb information, and hair style information of the target user.
Optionally, the service provider is a live service provider.
Optionally, the method further includes: the second end or the first end initiates a request to acquire information to be analyzed; after the first end or the second end agrees, an information acquisition program is started, and the first end acquires the information to be analyzed of the target user through the information acquisition window.
Optionally, the obtaining, by the first end, the to-be-analyzed information of the target user through the information obtaining window includes: the information acquisition program is started, and an image template used for acquiring the information to be analyzed of the target user is displayed in the information acquisition window; correspondingly obtaining the image information of the target user in the image template through the image obtaining device at the first end; and taking the image information as the information to be analyzed of the target user.
Optionally, the image template includes at least one of the following templates: a face image template, a limb image template and a hair style image template; correspondingly, the image information at least comprises one of the following information: face information, limb information, and hair style information.
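For illustration only, and not as part of the patent disclosure, the following minimal Python sketch shows one way a first end might implement the capture flow just described; the names TemplateKind, ToBeAnalyzed, acquire_to_be_analyzed, and the injected capture_in_template callable are hypothetical stand-ins for the image obtaining device and the information acquisition window.

```python
# Hypothetical sketch: capturing information to be analyzed through an
# image template displayed in the information acquisition window.
# All names here are illustrative assumptions, not part of the patent.
from dataclasses import dataclass
from enum import Enum
from typing import Callable


class TemplateKind(Enum):
    FACE = "face"   # face image template  -> face information
    LIMB = "limb"   # limb image template  -> limb information
    HAIR = "hair"   # hair style template  -> hair style information


@dataclass
class ToBeAnalyzed:
    kind: TemplateKind
    image_bytes: bytes  # image information captured inside the displayed template


def acquire_to_be_analyzed(capture_in_template: Callable[[TemplateKind], bytes],
                           kind: TemplateKind) -> ToBeAnalyzed:
    """Display the chosen template in the information acquisition window,
    capture the target user's image inside it via the image obtaining device,
    and treat that image as the information to be analyzed."""
    return ToBeAnalyzed(kind=kind, image_bytes=capture_in_template(kind))
```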
Optionally, the method further includes: the second end obtains a trigger operation of the service provider for an analysis result of the information to be analyzed of the target user; selecting a target analysis result of the target user from the analysis results according to the trigger operation; and sending the target analysis result to the first end, wherein the first end presents the target analysis result of the target user.
Optionally, the analyzing the information to be analyzed by the first end or the second end to obtain an analysis result, including: and the first end or the second end obtains the characteristic information of the information to be analyzed, and corresponding analysis results are screened out from a preset analysis result database according to the characteristic information.
Optionally, the screening out of the corresponding analysis result from the preset analysis result database according to the characteristic information includes: obtaining at least one piece of dimension characteristic information of the characteristic information; obtaining a dimension characteristic identifier corresponding to each analysis result in the preset analysis result database; and matching the dimension characteristic information with the dimension characteristic identifier corresponding to each analysis result to obtain a target analysis result, and taking the target analysis result as the corresponding analysis result screened out from the preset analysis result database.
Optionally, the dimension characteristic information at least includes one of the following information: skin characteristic information, skin visual effect characteristic information, color spot coordinate and quantity information, and pore coordinate and quantity information.
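To make the screening step concrete, here is a minimal Python sketch under an assumed data model (not the patented implementation): each stored analysis result carries a set of dimension characteristic identifiers, and the result whose identifiers overlap most with the extracted dimension characteristic information is returned.

```python
# Hypothetical sketch of screening an analysis result from a preset analysis
# result database by matching dimension characteristic information against
# the dimension characteristic identifiers of each stored result.
from dataclasses import dataclass
from typing import FrozenSet, List, Optional, Set


@dataclass
class StoredResult:
    dimension_ids: FrozenSet[str]  # dimension characteristic identifiers of this result
    text: str                      # the analysis result itself


def screen_result(dimension_features: Set[str],
                  database: List[StoredResult]) -> Optional[str]:
    """Return the stored result whose identifiers overlap most with the
    dimension characteristic information of the information to be analyzed."""
    best, best_overlap = None, 0
    for entry in database:
        overlap = len(entry.dimension_ids & dimension_features)
        if overlap > best_overlap:
            best, best_overlap = entry, overlap
    return best.text if best else None


# Example with illustrative dimension names (skin features, color spots, pores).
database = [
    StoredResult(frozenset({"dry_skin", "large_pores"}), "Recommend hydration care."),
    StoredResult(frozenset({"color_spots", "dull_tone"}), "Recommend spot treatment."),
]
print(screen_result({"dry_skin", "large_pores"}, database))
```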
Optionally, the method further includes: and after the second end analyzes the information to be analyzed to obtain an analysis result, obtaining associated object information associated with the analysis result according to the analysis result, and presenting the associated object information to the service provider by the second end.
Optionally, the method further includes: and the second end sends the associated object information to the first end, and the first end presents the associated object information to the target user.
Optionally, the associated object information at least includes case information or product link information.
Optionally, the method further includes: a third end capable of performing instant message interaction with the first end and the second end, the third end corresponding to at least one other user; after the third end obtains the information to be analyzed of the other users, it sends the information to the second end, the second end analyzes the information to be analyzed to obtain an analysis result, and the second end presents the analysis result to the service provider; or, after the third end obtains the information to be analyzed of the other users, the third end obtains an analysis result according to the information to be analyzed and sends the analysis result to the second end, and the second end presents the analysis result to the service provider.
Optionally, the method further includes: the second end obtains the trigger operation of the analysis result of the service provider aiming at the information to be analyzed of the other users; selecting target analysis results of other users from the analysis results of the other users according to the trigger operation; and sending the target analysis result to the first end, wherein the first end presents the target analysis results of the other users.
The embodiment of the application provides an information interaction method, which comprises the following steps: the first end can perform instant message interaction with a second end corresponding to at least one service provider, and the first end corresponds to at least one target user; the first end comprises an information acquisition window for acquiring information to be analyzed of the target user; and the first end acquires the information to be analyzed of the target user and sends the information to the second end, or after acquiring the information to be analyzed of the target user, the first end acquires an analysis result according to the information to be analyzed and sends the analysis result to the second end.
Optionally, the method further includes: the first end initiates a request to acquire information to be analyzed; after the first end agrees, an information acquisition program is started, and the first end acquires the information to be analyzed of the target user through the information acquisition window.
Optionally, the obtaining, by the first end, the to-be-analyzed information of the target user through the information obtaining window includes: the information acquisition program is started, and an image template used for acquiring the information to be analyzed of the target user is displayed in the information acquisition window; correspondingly obtaining the image information of the target user in the image template through the image obtaining device at the first end; and taking the image information as the information to be analyzed of the target user.
Optionally, the image template includes at least one of the following templates: a face image template, a limb image template and a hair style image template; correspondingly, the image information at least comprises one of the following information: face information, limb information, and hair style information.
Optionally, the analyzing the information to be analyzed by the first end to obtain an analysis result, including: the first end obtains the characteristic information of the information to be analyzed, and corresponding analysis results are screened out from a preset analysis result database according to the characteristic information.
Optionally, the screening out of the corresponding analysis result from the preset analysis result database according to the characteristic information includes: obtaining at least one piece of dimension characteristic information of the characteristic information; obtaining a dimension characteristic identifier corresponding to each analysis result in the preset analysis result database; and matching the dimension characteristic information with the dimension characteristic identifier corresponding to each analysis result to obtain a target analysis result, and taking the target analysis result as the corresponding analysis result screened out from the preset analysis result database.
Optionally, the dimension characteristic information at least includes one of the following information: skin characteristic information, skin visual effect characteristic information, color spot coordinate and quantity information, and pore coordinate and quantity information.
The embodiment of the application provides an information interaction method, which comprises the following steps: the second end can perform instant message interaction with the first end corresponding to at least one target user, and the second end corresponds to at least one service provider; the second end obtains the information to be analyzed of the target user sent by the first end, or the second end obtains the analysis result sent by the first end.
Optionally, after the second end obtains the information to be analyzed of the target user sent by the first end, the second end analyzes the information to be analyzed to obtain an analysis result, and the second end presents the analysis result to the service provider.
Optionally, the analyzing the information to be analyzed by the second end to obtain an analysis result, including: and the second end obtains the characteristic information of the information to be analyzed, and screens out a corresponding analysis result from a preset analysis result database according to the characteristic information.
Optionally, the screening out of the corresponding analysis result from the preset analysis result database according to the characteristic information includes: obtaining at least one piece of dimension characteristic information of the characteristic information; obtaining a dimension characteristic identifier corresponding to each analysis result in the preset analysis result database; and matching the dimension characteristic information with the dimension characteristic identifier corresponding to each analysis result to obtain a target analysis result, and taking the target analysis result as the corresponding analysis result screened out from the preset analysis result database.
Optionally, the dimension characteristic information at least includes one of the following information: skin characteristic information, skin visual effect characteristic information, color spot coordinate and quantity information, and pore coordinate and quantity information.
Optionally, the method further includes: the second end obtains a trigger operation of the service provider for an analysis result of the information to be analyzed of the target user; selecting a target analysis result of the target user from the analysis results according to the trigger operation; and sending the target analysis result to the first end.
Optionally, the method further includes: and after the second end analyzes the information to be analyzed to obtain an analysis result, obtaining associated object information associated with the analysis result according to the analysis result, and presenting the associated object information to the service provider by the second end.
Optionally, the method further includes: and the second end sends the associated object information to the first end.
Optionally, the associated object information at least includes case information or product link information.
Optionally, the method further includes: a third end capable of performing instant message interaction with the second end, the third end corresponding to at least one other user; after the third end obtains the information to be analyzed of the other users, it sends the information to the second end, the second end analyzes the information to be analyzed of the other users to obtain an analysis result, and the second end presents the analysis result to the service provider; or, after the third end obtains the information to be analyzed of the other users, the third end obtains an analysis result according to the information to be analyzed and sends the analysis result to the second end, and the second end presents the analysis result to the service provider.
Optionally, the method further includes: the second end obtains the trigger operation of the analysis result of the service provider aiming at the information to be analyzed of the other users; selecting target analysis results of other users from the analysis results according to the trigger operation; and sending the target analysis result to the first end.
The embodiment of the application provides a multimedia information interaction method, used in a multimedia information interaction process, comprising: initiating an information interaction request through a first end capable of initiating interaction requests; receiving, through at least one second end, the information interaction request initiated by the first end; in response to an operation of the second end confirming acceptance of the information interaction request, triggering an image recognition window at the second end, the image recognition window recognizing an image at the second end and acquiring the required information to be analyzed; analyzing the information to be analyzed and obtaining an analysis result; and sending the analysis result at least to the first end.
Optionally, the sending the analysis result to at least the first end includes: and sending the analysis result to the first end, and simultaneously sending the analysis result to the second end.
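As an illustration of the end-to-end flow just described (request, acceptance, triggered image recognition window, analysis, and return of the result), the following Python sketch models the two ends as plain objects; the class and method names are assumptions, and no real messaging or live broadcast API is implied.

```python
# Hypothetical sketch of the multimedia information interaction flow:
# the first end initiates a request, the second end accepts, its image
# recognition window is triggered, the captured information is analyzed,
# and the analysis result is returned at least to the first end.
from typing import Callable, Optional


class SecondEnd:
    def __init__(self, capture: Callable[[], bytes], analyze: Callable[[bytes], str]):
        self.capture = capture  # image recognition window capture
        self.analyze = analyze  # analysis of the information to be analyzed

    def on_interaction_request(self, accept: bool) -> Optional[str]:
        if not accept:
            return None
        to_be_analyzed = self.capture()   # image recognition window is triggered
        return self.analyze(to_be_analyzed)


class FirstEnd:
    def __init__(self, second_end: SecondEnd):
        self.second_end = second_end

    def start_interaction(self) -> Optional[str]:
        result = self.second_end.on_interaction_request(accept=True)
        # The result is returned to the first end; optionally it is also
        # sent back to the second end, as in the step above.
        return result
```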
An embodiment of the present application provides an information interaction apparatus, including: the information processing unit is used for the first end to obtain the information to be analyzed of the target user and then send the information to the second end, or after the first end obtains the information to be analyzed of the target user, the information to be analyzed is analyzed to obtain an analysis result, and the analysis result is sent to the second end; the first end and the second end are a first end and a second end which can interact with instant messages, and the first end and the second end at least carry out video information interaction; the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end comprises an information acquisition window for acquiring information to be analyzed of the target user.
An embodiment of the present application provides an information interaction apparatus, including: the information processing unit is used for the first end to obtain information to be analyzed of a target user and send the information to the second end, or after the first end obtains the information to be analyzed of the target user, the first end obtains an analysis result according to the information to be analyzed and sends the analysis result to the second end; the first end can perform instant message interaction with a second end corresponding to at least one service provider, and the first end corresponds to at least one target user; the first end comprises an information acquisition window for acquiring the information to be analyzed of the target user.
An embodiment of the present application provides an information interaction apparatus, including: the information processing unit is used for a second end to obtain the information to be analyzed of the target user sent by a first end, or the second end to obtain the analysis result sent by the first end; the second end is a second end capable of performing instant message interaction with a first end corresponding to at least one target user, and the second end corresponds to at least one service provider.
The embodiment of the application provides an information interaction method, which comprises the following steps: acquiring a first request message which is sent by an anchor end associated with a target user end and used for requesting to acquire to-be-analyzed information of a target user, wherein the target user end is a user end corresponding to the target user; aiming at the first request message, obtaining information to be analyzed of the target user; sending the information to be analyzed to the anchor terminal; and obtaining an analysis result of the information to be analyzed, which is returned by the anchor terminal and aims at the target user.
Optionally, the obtaining, for the first request message, information to be analyzed of the target user includes: loading a first information request area to be analyzed on a playing page of the target user side aiming at the first request message, wherein a prompt sign for acquiring information to be analyzed of the target user is presented in the first information request area to be analyzed; obtaining a first trigger operation aiming at the first information request area to be analyzed; and obtaining the information to be analyzed of the target user according to the first trigger operation.
Optionally, the obtaining information to be analyzed of the target user according to the first trigger operation includes: displaying an image template for obtaining the image information of the target user in the first information request area to be analyzed aiming at the first trigger operation; correspondingly obtaining the image information of the target user in the image template through an image obtaining device; and taking the image information as the information to be analyzed of the target user.
Optionally, the image template includes at least one of the following templates: a face image template, a limb image template and a hair style image template; correspondingly, the image information at least comprises one of the following image information: face information, limb information, and hair style information.
Optionally, the obtaining an analysis result of the information to be analyzed, which is returned by the anchor terminal and is specific to the target user, includes: correspondingly displaying the analysis result in an image template corresponding to the image information of the target user; and/or playing the analysis result in a playing page of the target user side.
Optionally, the correspondingly displaying the analysis result in the image template corresponding to the image information of the target user includes: correspondingly displaying a first part of analysis results of the analysis results in the image template corresponding to the image information of the target user; the first portion of analysis results includes a first scoring score.
Optionally, the playing of the analysis result in the playing page of the target user side includes: playing a second part of the analysis result in the playing page of the target user side, wherein the second part of the analysis result includes analysis information of each dimension characteristic of the information to be analyzed of the target user and suggestion information for the analysis information; the analysis information of each dimension characteristic at least includes skin characteristic information, skin visual effect characteristic information, coordinate and quantity information of color spots, and coordinate and quantity information of pores.
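The following Python sketch illustrates, under assumed names and data structures, how an analysis result could be split into the two parts described above: a first part (a score shown inside the image template) and a second part (per-dimension analysis information plus suggestions played on the playing page of the target user side).

```python
# Hypothetical sketch of splitting an analysis result into a first part
# (a scoring score displayed inside the image template) and a second part
# (per-dimension analysis information and suggestions for the playing page).
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class AnalysisResult:
    score: float                                                       # first part: scoring score
    dimension_analysis: Dict[str, str] = field(default_factory=dict)   # dimension -> analysis
    suggestions: Dict[str, str] = field(default_factory=dict)          # dimension -> suggestion


def render_in_template(result: AnalysisResult) -> str:
    # Displayed inside the image template that corresponds to the user's image.
    return f"Score: {result.score:.0f}"


def render_on_playing_page(result: AnalysisResult) -> List[str]:
    # Played on the playing page of the target user side, as text or voice.
    return [f"{dim}: {text}. Suggestion: {result.suggestions.get(dim, '')}"
            for dim, text in result.dimension_analysis.items()]
```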
Optionally, the method further includes: obtaining an analysis result of information to be analyzed, which is returned by the anchor terminal and aims at least one other user; wherein, the other users refer to users using other user sides; and obtaining a target analysis result screened from the analysis results of the information to be analyzed of other users and the analysis results of the information to be analyzed of the target user returned by the anchor terminal.
Optionally, the obtaining an analysis result of the information to be analyzed, which is returned by the anchor terminal and is specific to at least one other user, includes: acquiring at least one second information request area to be analyzed loaded on the playing page of the target user side, wherein a prompt mark for acquiring information to be analyzed of other users is presented in the second information request area to be analyzed; obtaining an image template which is displayed in the second information request area to be analyzed and used for obtaining the image information of the other users; correspondingly displaying the analysis results aiming at the other users in the image template corresponding to the image information of the other users, and/or playing the analysis results aiming at the other users in a playing page of the target user side.
Optionally, the correspondingly presenting, in the image template corresponding to the image information of the other users, of the analysis results for the other users includes: correspondingly displaying a first part of the analysis results of the other users in the image template corresponding to the image information of the other users, wherein the first part of the analysis results includes a second scoring score.
Optionally, the playing of the analysis results for the other users in the playing page of the target user side includes: playing a second part of the analysis results for the other users in the playing page of the target user side, wherein the second part of the analysis results includes analysis information of each dimension characteristic of the information to be analyzed of the other users and suggestion information for the analysis information; the analysis information of each dimension characteristic at least includes one of the following: skin characteristic information, skin visual effect characteristic information, coordinate and quantity information of color spots, and coordinate and quantity information of pores.
Optionally, the method further includes: obtaining associated object information which is returned by the anchor terminal and is associated with the information to be analyzed and aiming at the target user; and/or obtaining associated object information which is returned by the anchor terminal and is associated with information to be analyzed and is aimed at least one other user; and obtaining the user corresponding to the target analysis result returned by the anchor terminal.
Optionally, the method further includes: sending a second request message for requesting to acquire case information associated with the information to be analyzed to the anchor terminal, wherein the second request message comprises a feature identifier associated with the information to be analyzed; and obtaining case information which is returned by the anchor terminal aiming at the second request message and is associated with the information to be analyzed.
Optionally, the obtaining of the case information, which is returned by the anchor terminal for the second request message and is associated with the information to be analyzed, includes: loading a floating layer on a playing page of the target user side, wherein case information related to the information to be analyzed is loaded in the floating layer; and obtaining a second trigger operation aiming at the floating layer, and selecting case information associated with the information to be analyzed in the floating layer according to the second trigger operation.
The embodiment of the present application further provides an information interaction method, including: sending a first request message for requesting to acquire information to be analyzed of a target user to a target user side associated with an anchor side, wherein the target user side is a user side corresponding to the target user; obtaining information to be analyzed of the target user returned by the target user side; obtaining an analysis result of information to be analyzed for the target user; and sending the analysis result to the target user side.
Optionally, the sending a first request message for requesting to acquire information to be analyzed of a target user to a target user side associated with the anchor side includes: obtaining a first instruction module which contains a first request message on a control page of the anchor terminal; and acquiring a third trigger operation aiming at the first instruction module, and sending a first request message for requesting to acquire the to-be-analyzed information of the target user to a target user side associated with the anchor terminal according to the third trigger operation.
Optionally, the obtaining an analysis result of the information to be analyzed for the target user includes: and obtaining the characteristic information of the information to be analyzed, and screening out a corresponding analysis result from a preset analysis result database according to the characteristic information.
Optionally, the screening out of the corresponding analysis result from the preset analysis result database according to the characteristic information includes: obtaining at least one piece of dimension characteristic information of the characteristic information; obtaining a dimension characteristic identifier corresponding to each analysis result in the preset analysis result database; and matching the dimension characteristic information with the dimension characteristic identifier corresponding to each analysis result to obtain a target analysis result, and taking the target analysis result as the corresponding analysis result screened out from the preset analysis result database.
Optionally, the dimension characteristic information at least includes one of the following information: skin characteristic information, skin visual effect characteristic information, color spot coordinate and quantity information, and pore coordinate and quantity information.
Optionally, the method further includes: sending a third request message for requesting to acquire to-be-analyzed information of other users to at least one other user side associated with the anchor side, wherein the at least one other user side is a user side corresponding to the at least one other user; obtaining the information to be analyzed of the other users returned by the other users; obtaining an analysis result of the information to be analyzed for the other users; and sending the analysis result to the target user side and/or other user sides.
Optionally, the sending a third request message for requesting to acquire to-be-analyzed information of another user to another user side associated with the anchor side includes: a third instruction module which contains a third request message on the control page of the anchor terminal is obtained; and obtaining a fourth trigger operation aiming at the third instruction module, and sending a third request message for requesting to acquire the information to be analyzed of other users to other user sides associated with the anchor side according to the fourth trigger operation.
Optionally, the obtaining an analysis result of the information to be analyzed for the other user includes: obtaining image information of the information to be analyzed of the other users; obtaining target dimension characteristic information of the image information; and screening an analysis result matched with the target dimension characteristic information from a preset analysis result information base according to the target dimension characteristic information.
Optionally, the method further includes: screening a target analysis result from the analysis results of the information to be analyzed of the other users and the analysis result of the information to be analyzed of the target user; the sending of the analysis result to the target user side and/or the other user sides includes: sending the screened target analysis result to the target user side and/or the other user sides.
Optionally, the screening of the target analysis result from the analysis results of the information to be analyzed of the other users and the analysis result of the information to be analyzed of the target user includes: obtaining a first scoring score of the analysis result of the target user; obtaining a second scoring score of the analysis results of the other users; and comparing the first scoring score with the second scoring score, and taking the analysis result corresponding to the higher score as the target analysis result.
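A minimal sketch of this comparison step, assuming each analysis result is represented as a dictionary with a "score" entry (an illustrative representation, not the patented one):

```python
# Hypothetical sketch: select the target analysis result by comparing the
# first scoring score (target user) with the second scoring score (other
# user) and keeping the result with the higher score.
from typing import Dict


def select_target_result(target_user_result: Dict, other_user_result: Dict) -> Dict:
    first_score = target_user_result["score"]
    second_score = other_user_result["score"]
    return target_user_result if first_score >= second_score else other_user_result


# Example usage with illustrative values.
print(select_target_result({"score": 82, "text": "target user result"},
                           {"score": 76, "text": "other user result"}))
```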
Optionally, the method further includes: obtaining associated object information associated with the information to be analyzed for the target user; and/or obtaining associated object information associated with information to be analyzed for at least one other user; determining a user associated with the target analysis result.
Optionally, the method further includes: obtaining a second request message sent by the target user side and used for requesting to obtain the case information associated with the information to be analyzed; obtaining case information associated with the information to be analyzed aiming at the second request message; and sending the case information associated with the information to be analyzed to the target user side.
Optionally, the obtaining, for the second request message, case information associated with the information to be analyzed includes: obtaining a feature identifier which is included in the second request message and is associated with the information to be analyzed; acquiring case marks of each case information in a preset case information base; matching the characteristic identifier associated with the information to be analyzed with each case mark, and matching target case information corresponding to the case mark associated with the characteristic identifier associated with the information to be analyzed from the preset case information base; and taking the target case information as case information associated with the information to be analyzed.
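For illustration, the matching of the feature identifier against the case marks of a preset case information base might look like the following Python sketch; the mark structure and the example entries are assumptions.

```python
# Hypothetical sketch: answer the second request message by matching the
# feature identifier associated with the information to be analyzed against
# the case mark of each entry in a preset case information base.
from typing import Dict, List, Tuple


def find_case_information(feature_id: str,
                          case_base: Dict[Tuple[str, ...], str]) -> List[str]:
    """A case matches when its case mark contains the requested feature identifier."""
    return [info for mark, info in case_base.items() if feature_id in mark]


# Example with illustrative marks and case information.
case_base = {
    ("color_spots", "middle_age"): "Case A: spot-removal treatment record",
    ("large_pores",): "Case B: pore-refinement treatment record",
}
print(find_case_information("color_spots", case_base))
```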
An embodiment of the present application further provides an information interaction apparatus, including: a request message obtaining unit, configured to obtain a first request message, which is sent by an anchor associated with a target user side and is used for requesting to obtain information to be analyzed of a target user, where the target user side is a user side corresponding to the target user; an information to be analyzed obtaining unit, configured to obtain, for the first request message, information to be analyzed of the target user; the information to be analyzed sending unit is used for sending the information to be analyzed to the anchor terminal; and the analysis result obtaining unit is used for obtaining an analysis result of the information to be analyzed, which is returned by the anchor terminal and aims at the target user.
An embodiment of the present application further provides an information interaction apparatus, including: a request message sending unit, configured to send a first request message for requesting to acquire information to be analyzed of a target user to a target user side associated with an anchor side, where the target user side is a user side corresponding to the target user; the information to be analyzed obtaining unit is used for obtaining the information to be analyzed of the target user returned by the target user side; an analysis result obtaining unit configured to obtain an analysis result of information to be analyzed for the target user; and the analysis result sending unit is used for sending the analysis result to the target user side.
The embodiment of the present application further provides a multimedia information interaction method, used in a multimedia information interaction process, comprising: initiating an information interaction request through a first end capable of initiating interaction requests; receiving, by at least one second end, the information interaction request initiated by the first end; in response to an operation of the second end confirming acceptance of the information interaction request, obtaining a trigger for an image recognition window of the second end, the image recognition window recognizing an image at the second end and acquiring the required information to be analyzed; analyzing the information to be analyzed and obtaining an analysis result; and sending the analysis result at least to the first end.
Optionally, the sending the analysis result to at least the first end includes: and sending the analysis result to the first end, and simultaneously sending the analysis result to the second end.
Optionally, the obtaining of the trigger for the image recognition window of the second end, where the image recognition window recognizes the image of the second end and obtains the required information to be analyzed, includes: displaying an image template for obtaining the information to be analyzed in the image recognition window according to the triggering of the image recognition window aiming at the second end; correspondingly obtaining image information of the image in the image template through an image obtaining device; and taking the image information as the information to be analyzed.
Optionally, the image template includes at least one of the following templates: a face image template, a limb image template and a hair style image template; correspondingly, the image information at least comprises one of the following image information: face information, limb information, and hair style information.
Optionally, the analyzing the information to be analyzed and obtaining an analysis result includes: obtaining target dimension characteristic information of the image information; and screening an analysis result matched with the target dimension characteristic information from a preset analysis result information base according to the target dimension characteristic information.
Optionally, the obtaining of the target dimension characteristic information of the image information includes: obtaining standard dimension characteristic information of the image template information; obtaining dimension characteristic information corresponding to the image in the image information; and matching the dimension characteristic information with the standard dimension characteristic information to obtain target dimension characteristic information of the image information.
Optionally, the screening, according to the target dimension characteristic information, an analysis result matched with the target dimension characteristic information from a preset analysis result information base includes: obtaining the associated identification of each analysis result in the preset analysis result information base; matching the target dimension characteristic information with the associated identification to obtain a target analysis result matched with the target dimension characteristic information; and taking the target analysis result as an analysis result matched with the target dimension characteristic information.
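The two matching steps above (obtaining target dimension characteristic information by matching against the template's standard dimension characteristics, then screening by associated identifier) could be sketched as follows; the set-based representation is an assumption made for illustration.

```python
# Hypothetical sketch of the two matching steps: intersect the image's
# dimension characteristic information with the image template's standard
# dimension characteristics to obtain the target dimension characteristics,
# then screen the preset analysis result information base by the associated
# identifier of each entry.
from typing import Dict, FrozenSet, List, Set


def target_dimension_features(image_features: Set[str],
                              standard_features: Set[str]) -> Set[str]:
    # Keep only the dimensions that the image template defines as standard.
    return image_features & standard_features


def screen_by_associated_id(target_features: Set[str],
                            result_base: Dict[FrozenSet[str], str]) -> List[str]:
    """An entry matches when its associated identifier is covered by the
    target dimension characteristic information."""
    return [result for ident, result in result_base.items()
            if ident <= target_features]
```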
The embodiment of the present application further provides a multimedia information interaction apparatus, which is used in a multimedia information interaction process, and includes: the information interaction request sending unit is used for initiating an information interaction request through a first end capable of initiating an interaction request; the information interaction request receiving unit is used for receiving an information interaction request initiated by the first end through at least one second end; the information to be analyzed acquisition unit is used for responding to the second end to confirm the operation request for receiving the information interaction request and acquiring the trigger of an image recognition window aiming at the second end, and the image recognition window recognizes the image of the second end and acquires the required information to be analyzed; the analysis unit is used for analyzing the information to be analyzed and obtaining an analysis result; and the analysis result sending unit is used for sending the analysis result at least to the first end.
An embodiment of the present application further provides an electronic device, where the electronic device includes: a processor; a memory for storing a computer program for execution by the processor to perform the above described method.
An embodiment of the present application further provides a computer storage medium, where a computer program is stored, and the computer program is executed by a processor to perform the method described above.
Compared with the prior art, the method has the following advantages:
the embodiment of the application provides an information interaction method, comprising: obtaining a first request message, sent by an anchor terminal associated with a target user side, for requesting to acquire information to be analyzed of a target user, the target user side being the user side corresponding to the target user; obtaining the information to be analyzed of the target user for the first request message; sending the information to be analyzed to the anchor terminal; and obtaining an analysis result, returned by the anchor terminal, of the information to be analyzed of the target user. According to the embodiment of the application, the target user side can obtain the first request message actively initiated by the anchor terminal for acquiring the information to be analyzed of the target user, obtain the information to be analyzed of the target user in a targeted manner according to the first request message, and send it to the anchor terminal; the anchor terminal can then analyze the information to be analyzed to obtain the corresponding analysis result. Because the analysis result is obtained by analyzing the information to be analyzed, the analysis result returned by the anchor terminal for the information to be analyzed of the target user is more targeted; interaction between the anchor and the user is thereby increased, and the accuracy of the analysis result provided by the anchor for the information to be analyzed is improved.
The embodiment of the application also provides an information interaction method, comprising: a first end and a second end capable of instant message interaction, the first end and the second end at least performing video information interaction; the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end includes an information acquisition window for acquiring information to be analyzed of the target user; after acquiring the information to be analyzed of the target user, the first end sends it to the second end, or the first end analyzes the information to be analyzed to obtain an analysis result and sends the analysis result to the second end. Because the analysis result is obtained by analyzing the information to be analyzed, the analysis result of the information to be analyzed of the target user obtained by the first end or the second end is more targeted; interaction between the live broadcast service provider and the target user is thereby increased, and the accuracy of the analysis result provided by the live broadcast service provider for the information to be analyzed is improved.
Drawings
Fig. 1 is a schematic diagram of an application scenario provided in a first embodiment of the present application.
Fig. 2 is a flowchart of an information interaction method according to a first embodiment of the present application.
Fig. 3 is a flowchart of an information interaction method according to a second embodiment of the present application.
Fig. 4 is a flowchart of an information interaction method according to a third embodiment of the present application.
Fig. 5 is a schematic diagram of an application scenario provided in the fourth embodiment of the present application.
Fig. 6 is a flowchart of an information interaction method according to a fourth embodiment of the present application.
Fig. 7 is a flowchart of an information interaction method according to a fifth embodiment of the present application.
Fig. 8 is a schematic view of an information interaction device according to a sixth embodiment of the present application.
Fig. 9 is a schematic view of an information interaction apparatus according to a ninth embodiment of the present application.
Fig. 10 is a schematic diagram of an information interaction device according to a tenth embodiment of the present application.
Fig. 11 is a schematic view of an electronic device according to an eleventh embodiment of the present application.
Fig. 12 is a flowchart of a multimedia information interaction method according to a thirteenth embodiment of the present application.
Fig. 13 is a schematic diagram of an application scenario provided in the thirteenth embodiment of the present application.
Fig. 14 is a schematic view of a multimedia information interaction apparatus according to a fourteenth embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments of the present application. However, the embodiments of this application can be implemented in many forms other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the embodiments of this application; therefore, the embodiments of this application are not limited to the specific embodiments disclosed below.
In order to enable those skilled in the art to better understand the solution of the present application, a detailed description is given below on a specific application scenario of an embodiment of the present application based on the information interaction method provided by the present application, as shown in fig. 1, which is a schematic diagram of the application scenario provided by a first embodiment of the present application.
Specifically, the application scenario is a medical aesthetics live broadcast scenario. The scenario includes a first end corresponding to a target user, where the first end may be a mobile phone or a tablet computer and the target user is the initiator of the interaction information; and a second end corresponding to at least one service provider, where the second end may be a mobile phone or a tablet computer and the service provider is the receiver of the interaction information.
In this scenario, in order to make the interaction between the target user and the service provider more engaging, the service provider may interact with the target user according to an information interaction request initiated by the first end. Specifically, after the second end confirms acceptance of the information interaction request, the interaction interface of the first end displays an information acquisition window for acquiring the information to be analyzed of the target user. An image template for acquiring the information to be analyzed is displayed in the information acquisition window, the image information of the target user is correspondingly acquired in the image template through an image acquisition device, and the image information is used as the information to be analyzed. After acquiring the information to be analyzed of the target user, the first end sends it to the second end; alternatively, the first end may send the information to be analyzed to a third-party server for processing. Specifically, the first end or the second end obtains the characteristic information of the information to be analyzed, obtains at least one piece of dimension characteristic information of the characteristic information, obtains the dimension characteristic identifier corresponding to each analysis result in the preset analysis result database, matches the dimension characteristic information with the dimension characteristic identifier corresponding to each analysis result to obtain a target analysis result, screens out the corresponding analysis result from the preset analysis result database, and presents the analysis result in the information acquisition window of the first end.
In this scenario, in order to keep the content of the playing page of the first end simple, the analysis result displayed in the information acquisition window on the interaction interface of the first end is only a partial analysis result, specifically a first score; the target user can thus obtain an interaction result through an intuitive score. Of course, in this scenario, the complete analysis result may also be presented on the interaction interface of the first end, either as text or as voice; that is, the complete analysis result is displayed as text information or played back through voice playback.
In this scenario, the interaction with the second end is actively initiated through the first end, and the first end or the second end obtains an analysis result after analyzing the information to be analyzed. The target user corresponding to the first end and the service provider corresponding to the second end are therefore not limited to text or video chat: related images can be analyzed during the interaction, which makes the interaction more engaging and makes the obtained analysis result more accurate, thereby improving the user experience.
It should be noted that the application scenario is only one embodiment of the application scenario, and the embodiment of the application scenario is provided to facilitate understanding of the multimedia information interaction method of the present application, and is not used to limit the multimedia information interaction method of the present application.
Hereinafter, the information interaction method and two other information interaction methods provided by the present application are described in detail by different embodiments, respectively.
Fig. 2 is a flowchart of an information interaction method according to a first embodiment of the present application. The method is applied to the user side, and particularly to a target user side used by a target user. The method comprises the following steps.
Step S101: the first end obtains the information to be analyzed of the target user and then sends the information to the second end, or the first end obtains the information to be analyzed of the target user, then analyzes the information to be analyzed to obtain an analysis result, and sends the analysis result to the second end.
In this step, the first end and the second end are ends capable of instant message interaction with each other, and they at least perform video message interaction. The first end and the second end can be clients used by any user, and the clients specifically include a mobile phone, a tablet computer, and the like. In the method, the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the service provider is a live broadcast service provider. When a target user corresponding to the first end wants to perform information interaction with a service provider corresponding to the second end, the target user needs to initiate an information interaction request through the first end. Specifically, the first end obtains a first trigger operation of the target user for the information interaction request and sends the information interaction request to the second end according to the first trigger operation.
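As an illustration of this step, the following minimal Python sketch shows one way the first end might turn the target user's first trigger operation into an information interaction request sent to the second end. The message fields, function names, and transport are assumptions made for this example and are not specified by the embodiment.

```python
# Illustrative sketch only: the first end reacts to the target user's first
# trigger operation by building and sending an information interaction request.
import json
import uuid


def on_first_trigger(target_user_id: str, second_end_address: str) -> dict:
    """Build the information interaction request issued by the first end."""
    request = {
        "request_id": str(uuid.uuid4()),
        "type": "information_interaction_request",
        "initiator": target_user_id,     # the target user initiates the interaction
        "receiver": second_end_address,  # the service provider receives it
    }
    send_to_second_end(request)
    return request


def send_to_second_end(request: dict) -> None:
    # The transport is not specified in the text; a serialized message stands in here.
    payload = json.dumps(request)
    print(f"sending interaction request: {payload}")


on_first_trigger("target-user-001", "service-provider-endpoint")
```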
In addition, the first end comprises an information acquisition window for acquiring the information to be analyzed of the target user. The information acquisition window can be directly presented on the interactive interface of the first end when the first end and the second end carry out information interaction, or can be presented on the interactive interface of the first end after the first end sends an information interaction request to the second end. The information to be analyzed includes medical and aesthetic information that the target user can provide. The information acquisition window is an image recognition window and is used for at least acquiring the facial information, the limb information and the hair style information of the target user.
As described above, the information acquisition window of the first end is used for acquiring the information to be analyzed of the target user. In order to acquire this information, the second end or the first end needs to initiate a request for acquiring the information to be analyzed; after the first end or the second end agrees, an information acquisition program is started, and the first end acquires the information to be analyzed of the target user through the information acquisition window.
The first end acquires the information to be analyzed of the target user through the information acquisition window, and the method comprises the following steps: and starting the information acquisition program, and displaying an image template for acquiring the information to be analyzed of the target user in the information acquisition window. Wherein the image template at least comprises one of the following templates: a face image template, a limb image template and a hair style image template; correspondingly, the image information at least comprises one of the following information: face information, limb information, and hair style information. And then, correspondingly obtaining the image information of the target user in the image template through the image obtaining device at the first end, and taking the image information as the information to be analyzed of the target user. Corresponding to the image template, the image information obtained by the image obtaining device in the image template at least comprises one of the following image information: face information, limb information, and hair style information.
It should be noted that whichever image template is displayed in the information acquisition window, the image information obtained by the target user end corresponds to that image template. Taking the case where the information to be analyzed is facial feature information as an example, when a facial image template is shown in the information acquisition window, the image information obtained at the first end is the facial information of the target user, i.e., a facial image, an eye image, a blackhead image, a color spot image, and the like, together with the coordinate information and number information of these items within the facial image.
In this step, the image obtaining device may be a front-facing camera of the first end, and the obtained image information of the target user may be a dynamic image comprising at least one frame of static picture; image information made up of multiple frames tends to be more accurate, which in turn improves the accuracy of the analysis result subsequently obtained for the information to be analyzed of the target user.
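To make the acquisition step concrete, the sketch below bundles several frames captured inside a chosen image template into the information to be analyzed. The template names come from the text; the data structures, frame count, and the stubbed-out camera call are assumptions for illustration.

```python
# Illustrative sketch: multi-frame capture within an image template.
from dataclasses import dataclass, field
from typing import List

TEMPLATES = ("face", "limb", "hair_style")  # templates named in the text


@dataclass
class InfoToBeAnalyzed:
    template: str
    frames: List[bytes] = field(default_factory=list)


def capture_frame() -> bytes:
    # Stand-in for the image obtaining device, e.g. the front-facing camera.
    return b"<raw frame bytes>"


def acquire_info_to_be_analyzed(template: str, frame_count: int = 5) -> InfoToBeAnalyzed:
    if template not in TEMPLATES:
        raise ValueError(f"unknown image template: {template}")
    info = InfoToBeAnalyzed(template=template)
    # A dynamic image made of several static frames gives the later analysis
    # step more to work with than a single still picture.
    for _ in range(frame_count):
        info.frames.append(capture_frame())
    return info


face_info = acquire_info_to_be_analyzed("face")
```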
In the first embodiment of the present application, after the first end obtains the information to be analyzed of the target user, the information to be analyzed may be sent to the second end, the second end analyzes the information to obtain an analysis result, and the second end presents the analysis result to the service provider. Alternatively, after the first end obtains the information to be analyzed of the target user, the first end analyzes the information to obtain an analysis result and sends the analysis result to the second end. Because the first end and the second end process the information to be analyzed in the same or a similar manner, the processing is explained once below.
The first end or the second end analyzes the information to be analyzed to obtain an analysis result, and the analysis result comprises the following steps: specifically, the first end or the second end performs imaging processing on the information to be analyzed to obtain image information corresponding to the information to be analyzed, and obtains each piece of feature information in the image according to an image processing technology, wherein the feature information at least comprises skin features, eye features, nose features, blackhead features, color spot features and the like. And then, screening out a corresponding analysis result from a preset analysis result database according to the characteristic information. Specifically, at least one dimension feature information of the feature information is obtained, and the dimension feature information is information of the feature information in different dimensions. For example, the feature information is skin features, and the corresponding dimension feature information may be skin feature information, skin visual effect feature information, and the like; for another example, the feature information is color spot feature information, and the corresponding dimension feature information may be color spot coordinate information, color spot number information, or the like; for another example, the feature information is pore feature information, and the corresponding dimension feature information may be pore coordinate information, pore number information, or the like. And then, obtaining the dimension characteristic identification corresponding to each analysis result in the preset analysis result database, wherein in the step, each analysis result is correspondingly obtained according to historical data, namely, a related analysis result is correspondingly obtained through historical dimension characteristic information, and each analysis result is stored in the preset analysis result database. In order to facilitate the subsequent use of the results, each analysis result corresponds to the relevant dimension feature identification, so that the corresponding dimension feature information is matched with the analysis result. And finally, matching the dimension characteristic information with the dimension characteristic identification corresponding to each analysis result to obtain a target analysis result, and taking the target analysis result as a corresponding analysis result screened from a preset analysis result database.
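The matching step can be pictured with the short sketch below. It assumes each stored analysis result carries a set of dimension feature identifiers and simply takes the result whose identifiers overlap most with the extracted dimension feature information; the identifiers, the overlap rule, and the database contents are illustrative assumptions rather than the concrete implementation of the embodiment.

```python
# Sketch of matching dimension feature information against a preset
# analysis result database (all names and data are invented).
from typing import Dict, List, Set

# Hypothetical preset analysis result database built from historical data.
PRESET_ANALYSIS_RESULTS: List[Dict] = [
    {"id": "r1", "dimension_ids": {"skin:dry", "spot:few"}, "text": "mild dryness, few color spots"},
    {"id": "r2", "dimension_ids": {"skin:oily", "pore:many"}, "text": "oily skin, enlarged pores"},
]


def match_analysis_result(dimension_info: Set[str]) -> Dict:
    """Pick the stored result whose dimension feature identifiers best
    overlap with the dimension feature information extracted from the image."""
    best, best_overlap = None, -1
    for result in PRESET_ANALYSIS_RESULTS:
        overlap = len(result["dimension_ids"] & dimension_info)
        if overlap > best_overlap:
            best, best_overlap = result, overlap
    return best


# Example: dimension feature information derived from skin and pore features.
target_result = match_analysis_result({"skin:oily", "pore:many", "spot:few"})
print(target_result["text"])  # -> "oily skin, enlarged pores"
```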
After the first end obtains the analysis results, the analysis results may be sent to the second end. And after the second end obtains the analysis result, the second end presents the analysis result to the service provider. In the first embodiment of the present application, in consideration of a large number of analysis results, in order to simplify interaction between the service provider and the target user, the service provider may specifically select the target analysis result again from the numerous analysis results and push the target analysis result to the first end, so that the target user corresponding to the first end may obtain the target analysis result. Specifically, the second end obtains a trigger operation of the service provider for an analysis result of the information to be analyzed of the target user, selects a target analysis result of the target user from the analysis results according to the trigger operation, and sends the target analysis result to the first end, and the first end presents the target analysis result of the target user.
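As a rough sketch of this selection-and-push step, the code below lets the service provider's trigger operation pick one result out of many and forward it to the first end; the function names and the transport stub are invented for illustration.

```python
# Illustrative sketch: the second end forwards the target analysis result
# chosen by the service provider to the first end.
from typing import Dict, List


def on_provider_trigger(analysis_results: List[Dict], selected_index: int,
                        first_end_address: str) -> Dict:
    """The service provider picks one result; the second end forwards it."""
    target_result = analysis_results[selected_index]
    push_to_first_end(first_end_address, target_result)
    return target_result


def push_to_first_end(address: str, result: Dict) -> None:
    # Stand-in for whatever channel the first end and the second end share.
    print(f"pushing target analysis result to {address}: {result}")
```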
In the first embodiment of the present application, the analysis result may be a score corresponding to the information to be analyzed, text content or voice content corresponding to the analysis result of the information to be analyzed, and the like. The first end and the second end respectively display the analysis result, and the first end displays the analysis result in the following two modes. In a first mode, the analysis result is correspondingly displayed in the image template in the information acquisition window of the first end; specifically, a first part of the analysis result is correspondingly displayed in the image template, where the first part of the analysis result includes a first score. In a second mode, the analysis result is played in the interactive interface of the first end; specifically, a second part of the analysis result is played in the interactive interface of the first end, where the second part of the analysis result includes each dimension feature information of the information to be analyzed of the target user and suggestion information for each dimension feature information.
In the first mode, only the first score is displayed in the interactive interface of the first end, so that the target user can directly obtain the evaluation score for the information to be analyzed, and the complexity of the interactive interface of the first end can be reduced. In the second mode, the suggestion information for each dimension feature information of the target user is displayed in the interactive interface of the first end, so that the target user can clearly know his or her specific situation and can perform further processing according to the suggestion information. For example, the dimension feature information may be the coordinates and number of color spots, and the corresponding suggestion information may be specific information on how to repair those color spots. It should be noted that the two display manners included in the first end's display of the analysis result may be used separately or simultaneously.
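The two display modes can be pictured with the following sketch, which renders either only the first score (the first mode) or the per-dimension details together with the suggestion information (the second mode); the result structure and field names are assumptions made for this example.

```python
# Illustrative sketch of the two display modes for the analysis result.
from typing import Dict


def present_in_template(result: Dict) -> str:
    # First mode: keep the interactive interface simple and show only the score.
    return f"score: {result['first_score']}"


def present_full(result: Dict) -> str:
    # Second mode: per-dimension feature information plus suggestion information,
    # rendered as text (voice playback would consume the same content).
    lines = []
    for dimension, detail in result["dimensions"].items():
        lines.append(f"{dimension}: {detail['finding']} -> {detail['suggestion']}")
    return "\n".join(lines)


result = {
    "first_score": 86,
    "dimensions": {
        "color_spot": {"finding": "3 spots near the cheek", "suggestion": "targeted repair"},
        "pore": {"finding": "slightly enlarged", "suggestion": "cleansing routine"},
    },
}
print(present_in_template(result))
print(present_full(result))  # the two modes may also be shown together
```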
In the first embodiment of the present application, the method further includes: and after the second end analyzes the information to be analyzed to obtain an analysis result, obtaining associated object information associated with the analysis result according to the analysis result, and presenting the associated object information to the service provider by the second end. After the service provider sees the associated object information through the second end, the method further comprises the following steps: and the second end sends the associated object information to the first end, and the first end presents the associated object information to the target user. The associated object information at least comprises case information or product link information. Specifically, the second end obtains a trigger operation for the associated object information, sends the associated object information to the first end, and loads a floating layer on a playing page of the first end, wherein case information or product link information associated with the information to be analyzed is loaded in the floating layer.
Of course, the case information or product link information obtained by the first end may also be unrelated to the current information to be analyzed; that is, when the service provider broadcasts a specific medical and aesthetic case, the second end may actively push the case information or product link information corresponding to that case to the first end through a trigger operation at the second end, so that the target user using the first end can obtain the case information or product link information through the first end.
In the first embodiment of the application, the first end and the second end can also interact with third ends corresponding to other users, so as to increase the interactivity and interest of the interaction. Specifically, the method further comprises: a third end capable of instant message interaction with the first end and the second end, where the third end corresponds to at least one other user. The third end obtains the information to be analyzed of other users and then sends it to the second end, the second end analyzes the information to be analyzed to obtain an analysis result, and the second end presents the analysis result to the service provider; or, after the third end obtains the information to be analyzed of the other users, the third end obtains an analysis result according to the information to be analyzed and sends the analysis result to the second end, and the second end presents the analysis result to the service provider. The third end and the second end analyze the information to be analyzed of other users to obtain corresponding analysis results in the same manner that the first end and the second end analyze the information to be analyzed of the target user to obtain corresponding analysis results.
In the first embodiment of the present application, the first end and the third end may be associated through the second end, or the first end and the third end may be associated through a third-party server. Specifically, after the second end obtains the analysis result of the information to be analyzed of the other user, the second end obtains a trigger operation of the service provider for the analysis result of the information to be analyzed of the other user, sends the analysis result of the other user to the first end according to the trigger operation, and the first end presents the analysis result of the other user.
In the first embodiment of the present application, the analysis result of the other user may be a score corresponding to the information to be analyzed of the other user, text content or voice content corresponding to the analysis result of the information to be analyzed of the other user, or the like. The analysis results of other users may be shown at the respective ends; in particular, the first end shows the analysis results of other users in the following two ways. In a first mode, an image template is added to the information acquisition window of the first end, and the analysis results of other users are correspondingly shown in the added image template; specifically, a first part of the analysis results of the other users is correspondingly shown in the image template, where the first part of the analysis results includes a second score. In a second mode, the analysis results of the other users are played in the interactive interface of the first end; specifically, a second part of the analysis results of the other users is played in the interactive interface of the first end, and the second part of the analysis results includes each dimension feature information of the information to be analyzed of the other users and suggestion information for each dimension feature information.
In the first mode, only the second score is displayed in the interactive interface of the first end, so that the target user can directly obtain the evaluation scores for the information to be analyzed of other users, and the complexity of the interactive interface of the first end can be reduced. In the second mode, the suggestion information for each dimension feature information of other users is displayed in the interactive interface of the first end, so that the target user can clearly know his or her own specific situation through comparison and can perform further processing according to the suggestion information. It should be noted that the two display manners included in the first end's display of the analysis results may be used separately or simultaneously.
Further, in consideration of a large number of analysis results of other users, the second end obtains a trigger operation of the service provider for the analysis results of the information to be analyzed of the other users, selects a target analysis result of the other user from the analysis results of the other users according to the trigger operation, and sends the target analysis result to the first end, and the first end presents the target analysis result of the other user.
In addition, after the second end analyzes the information to be analyzed of the other users to obtain the analysis result, the second end obtains the associated object information associated with the analysis result of the other users according to the analysis result, and the second end presents the associated object information to the service provider. After the service provider sees the associated object information through the second end, the method further comprises the following steps: and the second end sends the associated object information to the first end, and the first end presents the associated object information to the target user so that the target user can obtain the associated object information of the analysis result corresponding to the information to be analyzed of other users. The associated object information at least comprises case information or product link information.
A first embodiment of the present application provides an information interaction method, including: a first end and a second end capable of instant message interaction, where the first end and the second end at least perform video information interaction; the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end comprises an information acquisition window for acquiring the information to be analyzed of the target user; the first end acquires the information to be analyzed of the target user and then sends it to the second end, or the first end acquires the information to be analyzed of the target user, analyzes it to obtain an analysis result, and sends the analysis result to the second end. Because the analysis result is obtained by analyzing the information to be analyzed, the analysis result obtained by the first end or the second end for the target user is more targeted; this increases the interaction between the live broadcast service provider and the target user and, at the same time, improves the accuracy of the analysis result that the live broadcast service provider provides for the information to be analyzed.
Fig. 3 is a flowchart of an information interaction method according to a second embodiment of the present application. The method is applied to a first end, and in particular to a first end used by a target user. The method comprises the following steps.
Step 201, a first end obtains information to be analyzed of a target user and sends the information to a second end, or after the first end obtains the information to be analyzed of the target user, an analysis result is obtained according to the information to be analyzed, and the analysis result is sent to the second end.
In this step, the first end and the second end are ends capable of instant message interaction with each other, and they at least perform video message interaction. The first end and the second end can be clients used by any user, and the clients specifically include a mobile phone, a tablet computer, and the like. In the method, the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the service provider is a live broadcast service provider. When a target user corresponding to the first end wants to perform information interaction with a service provider corresponding to the second end, the target user needs to initiate an information interaction request through the first end. Specifically, the first end obtains a first trigger operation of the target user for the information interaction request and sends the information interaction request to the second end according to the first trigger operation.
In addition, the first end comprises an information acquisition window for acquiring the information to be analyzed of the target user. The information acquisition window can be directly presented on the interactive interface of the first end when the first end and the second end carry out information interaction, or can be presented on the interactive interface of the first end after the first end sends an information interaction request to the second end. The information acquisition window is an image recognition window and is used for at least acquiring the facial information, the limb information and the hair style information of the target user.
As described above, the information acquisition window of the first end is used for acquiring the information to be analyzed of the target user. In order to acquire this information, a request for acquiring the information to be analyzed is initiated; after the first end agrees, an information acquisition program is started, and the first end acquires the information to be analyzed of the target user through the information acquisition window.
In the second embodiment of the present application, the specific steps of the first end obtaining the information to be analyzed of the target user through the information obtaining window are the same as the specific steps of the first end obtaining the information to be analyzed of the target user through the information obtaining window in the first embodiment of the present application, so that repeated description is omitted here, and for details, reference may be made to the specific description of the first embodiment. In addition, the specific steps of the first end analyzing the information to be analyzed to obtain the analysis result may also refer to the specific description of the first embodiment, and will not be repeated herein.
A second embodiment of the present application provides an information interaction method, including: a first end capable of instant message interaction with a second end corresponding to at least one service provider, where the first end corresponds to at least one target user and comprises an information acquisition window for acquiring the information to be analyzed of the target user; the first end acquires the information to be analyzed of the target user and sends it to the second end, or, after acquiring the information to be analyzed of the target user, the first end obtains an analysis result according to the information to be analyzed and sends the analysis result to the second end. In the second embodiment of the present application, the analysis result is obtained by analyzing the information to be analyzed, so that the analysis result obtained by the first end or the second end for the target user is more targeted; this increases the interaction between the live broadcast service provider and the target user and, at the same time, improves the accuracy of the analysis result that the live broadcast service provider provides for the information to be analyzed.
Fig. 4 is a flowchart of an information interaction method according to a third embodiment of the present application. The method is applied to a second end, and in particular to a second end used by a service provider. The method comprises the following steps.
Step 301, a second end obtains information to be analyzed of a target user sent by a first end, or the second end obtains an analysis result sent by the first end.
In this step, the first end and the second end are ends capable of instant message interaction with each other, and they at least perform video message interaction. The first end and the second end can be clients used by any user, and the clients specifically include a mobile phone, a tablet computer, and the like. In the method, the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the service provider is a live broadcast service provider.
In the third embodiment of the present application, the information to be analyzed of the target user sent by the first end, and the process of obtaining the information to be analyzed of the target user by the first end may refer to the specific description of the first embodiment, and will not be repeated herein. In addition, the process of obtaining the information to be analyzed of the target user by the first end and analyzing the information to be analyzed of the target user to obtain the analysis result may also refer to the specific description of the first embodiment, and will not be repeated herein. The third embodiment of the present application will specifically describe a process in which the second end obtains information to be analyzed of the target user, and analyzes the information to be analyzed of the target user to obtain an analysis result.
Specifically, the second end performs imaging processing on the information to be analyzed to obtain image information corresponding to the information to be analyzed, and obtains each feature information in the image according to an image processing technology, wherein the feature information at least comprises skin features, eye features, nose features, blackhead features, color spot features and the like. And then, screening out a corresponding analysis result from a preset analysis result database according to the characteristic information. Specifically, at least one dimension feature information of the feature information is obtained, and the dimension feature information is information of the feature information in different dimensions. For example, the feature information is skin features, and the corresponding dimension feature information may be skin feature information, skin visual effect feature information, and the like; for another example, the feature information is color spot feature information, and the corresponding dimension feature information may be color spot coordinate information, color spot number information, or the like; for another example, the feature information is pore feature information, and the corresponding dimension feature information may be pore coordinate information, pore number information, or the like. And then, obtaining dimension characteristic identifiers corresponding to the analysis results in the preset analysis result database, specifically, in this step, each analysis result is correspondingly obtained according to historical data, that is, a related analysis result is correspondingly obtained through historical dimension characteristic information, and each analysis result is stored in the preset analysis result database. In order to facilitate the subsequent use of the results, each analysis result corresponds to the relevant dimension feature identification, so that the corresponding dimension feature information is matched with the analysis result. And finally, matching the dimension characteristic information with the dimension characteristic identification corresponding to each analysis result to obtain a target analysis result, and taking the target analysis result as a corresponding analysis result screened from a preset analysis result database.
And after the second end obtains the analysis result, the second end presents the analysis result to the service provider. In the third embodiment of the present application, in consideration of a large number of analysis results, in order to simplify interaction between the service provider and the target user, the service provider may specifically select the target analysis result again from the numerous analysis results and push the target analysis result to the first end, so that the target user corresponding to the first end may obtain the target analysis result. Specifically, the second end obtains a trigger operation of the service provider for an analysis result of the information to be analyzed of the target user, selects a target analysis result of the target user from the analysis results according to the trigger operation, and sends the target analysis result to the first end, and the first end presents the target analysis result of the target user.
In the third embodiment of the present application, the method further includes: and after the second end analyzes the information to be analyzed to obtain an analysis result, obtaining associated object information associated with the analysis result according to the analysis result, and presenting the associated object information to the service provider by the second end. After the service provider sees the associated object information through the second end, the method further comprises the following steps: and the second end sends the associated object information to the first end, and the first end presents the associated object information to the target user. Specifically, the second end obtains a trigger operation for the associated object information, sends the associated object information to the first end, and loads a floating layer on a playing page of the first end, wherein case information or product link information associated with the information to be analyzed is loaded in the floating layer. Wherein the associated object information at least comprises case information or product link information.
Of course, the case information or product link information obtained by the first end may also be unrelated to the current information to be analyzed; that is, when the service provider broadcasts a specific medical and aesthetic case, the second end may actively push the case information or product link information corresponding to that case to the first end through a trigger operation at the second end, so that the target user using the first end can obtain the case information or product link information through the first end.
In the third embodiment of the present application, the first end and the second end may also interact with third ends corresponding to other users, so as to increase the interactivity and interest of the interaction. Specifically, the method further comprises: a third end capable of instant message interaction with the first end and the second end, where the third end corresponds to at least one other user. The third end obtains the information to be analyzed of other users and then sends it to the second end, the second end analyzes the information to be analyzed to obtain an analysis result, and the second end presents the analysis result to the service provider; or, after the third end obtains the information to be analyzed of the other users, the third end obtains an analysis result according to the information to be analyzed and sends the analysis result to the second end, and the second end presents the analysis result to the service provider. The third end and the second end analyze the information to be analyzed of other users to obtain corresponding analysis results in the same manner that the first end and the second end analyze the information to be analyzed of the target user to obtain corresponding analysis results.
In the third embodiment of the present application, the first end and the third end may be associated through the second end, or the first end and the third end may be associated through a third-party server. Specifically, after the second end obtains the analysis result of the information to be analyzed of the other user, the second end obtains a trigger operation of the service provider for the analysis result of the information to be analyzed of the other user, sends the analysis result of the other user to the first end according to the trigger operation, and the first end presents the analysis result of the other user.
In addition, after the second end analyzes the information to be analyzed of the other users to obtain the analysis result, the second end obtains the associated object information associated with the analysis result of the other users according to the analysis result, and the second end presents the associated object information to the service provider. After the service provider sees the associated object information through the second end, the method further comprises the following steps: and the second end sends the associated object information to the first end, and the first end presents the associated object information to the target user so that the target user can obtain the associated object information of the analysis result corresponding to the information to be analyzed of other users. The associated object information at least comprises case information or product link information.
A third embodiment of the present application provides an information interaction method, including: a second end capable of instant message interaction with a first end corresponding to at least one target user, where the second end corresponds to at least one service provider; the second end obtains the information to be analyzed of the target user sent by the first end, or the second end obtains the analysis result sent by the first end. In the third embodiment of the present application, the analysis result of the information to be analyzed obtained by the second end is obtained through analysis processing, so that the analysis result obtained by the second end for the target user is more targeted; this increases the interaction between the live broadcast service provider and the target user and also improves the accuracy of the analysis result that the live broadcast service provider provides for the information to be analyzed.
In order to enable those skilled in the art to better understand the scheme of the present application, a detailed description is given below to a specific application scenario of an embodiment of the information interaction method based on the present application, as shown in fig. 5, which is a schematic diagram of the application scenario provided in the fourth embodiment of the present application.
Specifically, the application scene is a medical and beauty live broadcast scene. A medical and beauty live broadcast is mainly used for providing purchase decision services for users during medical and beauty consumption. When the medical and beauty anchor performs a live broadcast, the anchor needs to enter a live broadcast interface through a live broadcast end operated by the anchor; the live broadcast interface can establish a communication connection with the live broadcast client applications used by users (including a target user end of a target user and other user ends of other users), so that when the anchor broadcasts on the live broadcast interface, a user can watch the live broadcast content of the anchor after the live broadcast client used by the user enters the corresponding live broadcast room.
In this scene, in order to increase the interactive interest of the anchor and the user and to improve the accuracy of the medical and beauty service information provided by the anchor, when the medical and beauty anchor performs a live broadcast, the target user watches and interacts at the target user side used by the target user. On the playing page of the target user side, the target user can see the video that the anchor is live broadcasting. When the anchor broadcasts a specified medical and beauty case, the anchor can trigger an operation on the live broadcast end used by the anchor, specifically by clicking a virtual button on an interface of the live broadcast end; the live broadcast end receives the touch operation, generates a corresponding control instruction, and sends the control instruction to the playing page watched by the target user so as to load a floating layer, and case information corresponding to the medical and beauty case is loaded in the floating layer. The anchor end can control the automatic playing of the case information, and the target user can also control the playing of the case information; specifically, the target user triggers the floating layer, the playing page (a third playing page) of the target user end obtains the trigger operation for the floating layer, and the case information is selected in the floating layer according to the trigger operation for playing.
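A minimal sketch of this control-instruction flow is given below, under invented names: the anchor's click becomes an instruction that asks the target user's playing page to load a floating layer carrying the case information.

```python
# Illustrative sketch: from the anchor's button click to a floating layer
# loaded on the target user's playing page (all names are assumptions).
from typing import Dict


def on_anchor_button_click(case_id: str, autoplay: bool) -> Dict:
    """Build the control instruction generated by the live broadcast end."""
    return {
        "action": "load_floating_layer",
        "case_id": case_id,
        "autoplay": autoplay,  # the anchor end may drive playback itself
    }


def apply_control_instruction(playing_page: Dict, instruction: Dict) -> None:
    # On the target user side: attach the floating layer and load the case info.
    playing_page["floating_layer"] = {
        "case_id": instruction["case_id"],
        "playing": instruction["autoplay"],
    }


page = {"live_stream": "medical-beauty live room"}
apply_control_instruction(page, on_anchor_button_click("case-042", autoplay=True))
print(page["floating_layer"])
```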
In addition, the target user may further interact with the anchor according to a first request message initiated by the anchor for requesting to acquire information to be analyzed of the target user, specifically, in this scenario, a first instruction module including the first request message on a control page of the anchor is acquired, a third trigger operation for the first instruction module is acquired, and the first request message for requesting to acquire the information to be analyzed of the target user is sent to a target user side associated with the anchor according to the third trigger operation. The target user side obtains the first request message, and loads the first information request area to be analyzed on the playing page of the target user side, and the target user operates the first information request area to be analyzed to upload the information to be analyzed of the target user. Specifically, after obtaining a first trigger operation for a first information request area to be analyzed, a target user side displays an image template for obtaining image information of a target user in the first information request area to be analyzed, correspondingly obtains image information (a first playing page) of the target user in the image template through an image obtaining device, and takes the image information as the information to be analyzed of the target user. The target user side sends the information to be analyzed to the anchor side, the anchor side obtains image information of the information to be analyzed of the target user, obtains target dimension characteristic information of the image information, screens out an analysis result matched with the target dimension characteristic information from a preset analysis result information base according to the target dimension characteristic information, presents the analysis result on a control page of the anchor side, and sends the analysis result to the target user.
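The round trip described above can be summarized as three messages, sketched below with assumed field names: the anchor side's first request message, the target user side's reply carrying the information to be analyzed, and the anchor side's analysis result.

```python
# Illustrative sketch of the message exchange in this scenario; every field
# name here is an assumption made for the example.
from typing import Dict


def first_request_message(anchor_id: str) -> Dict:
    return {"type": "first_request", "from": anchor_id,
            "purpose": "acquire information to be analyzed"}


def info_to_be_analyzed_message(request: Dict, image_info: bytes) -> Dict:
    return {"type": "info_to_be_analyzed", "in_reply_to": request["type"],
            "image_info": image_info}


def analysis_result_message(first_score: int, details: Dict) -> Dict:
    return {"type": "analysis_result", "first_score": first_score, "details": details}


# A single round trip:
req = first_request_message("anchor-01")
info = info_to_be_analyzed_message(req, b"<face image frames>")
result = analysis_result_message(82, {"color_spot": "2 spots, left cheek"})
```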
In this scenario, in order to simplify the content on the playing page of the target user side, the analysis result displayed on the playing page of the target user side is only a partial analysis result, specifically the first score. The target user can learn the condition of the facial diagnosis problem from the intuitive score, and the solution to the facial diagnosis problem is explained through the anchor's voice. Of course, in this scenario, all contents of the analysis result may also be displayed on the playing page of the target user side, and the display manner may be text or voice; that is, all contents of the analysis result are displayed as text information, or all contents of the analysis result are played back as voice.
Further, in this scenario, the anchor side may also send a third request message to the other user sides to request the information to be analyzed of the other users. After the other user sides obtain the third request message, they acquire the information to be analyzed of the other users and feed it back to the anchor side. After the anchor side obtains the information to be analyzed of the other users, it can obtain the analysis results of that information and send the analysis results to the target user side and/or the other user sides.
This scene is explained taking the case where the analysis results of the information to be analyzed of other users are sent to the target user side: at least one second information request area to be analyzed loaded on the playing page of the target user side is obtained, an image template for obtaining the image information of other users displayed in the second information request area to be analyzed is obtained, and the analysis results for the other users are correspondingly displayed in the image templates corresponding to the image information of the other users. In order to simplify the content on the playing page of the target user side, a first analysis result of the analysis results for the other users is correspondingly displayed in the image template corresponding to the image information of the other users, and the first analysis result includes a second score.
In addition, the anchor side can also screen a target analysis result from the analysis results of the information to be analyzed of other users and the analysis results of the information to be analyzed of the target user, and send the screened target analysis result to the target user side. To enhance the interactive effect, the anchor side may obtain an associated object associated with the information to be analyzed according to the information to be analyzed of the target user or of other users, determine the user associated with the target analysis result according to the target analysis result, and give the corresponding associated object to that user.
Therefore, the interaction with the user is actively initiated through the anchor terminal, so that the enthusiasm of the user for participating in the interaction can be increased; in addition, the anchor terminal can actively push related case information to the user, or obtain an analysis result after analyzing the information to be analyzed of the user, so that the accuracy of the analysis result obtained by the user is higher, and the user experience is improved.
It should be noted that the application scenario is only one embodiment of the application scenario, and this embodiment of the application scenario is provided to facilitate understanding of the information interaction method of the present application, and is not used to limit the information interaction method of the present application.
Hereinafter, the information interaction method and another information interaction method provided by the present application are described in detail through different embodiments, respectively.
Fig. 6 is a flowchart of an information interaction method according to a fourth embodiment of the present application. The method is applied to the user side, and particularly to a target user side used by a target user. The method comprises the following steps.
Step S501: the method comprises the steps of obtaining a first request message which is sent by an anchor terminal associated with a target user terminal and used for requesting to obtain to-be-analyzed information of the target user.
In this step, the target user side is a user side corresponding to the target user, and the target user is a user who uses the current user side (see the user side playing page in fig. 5) to view live broadcast. The information to be analyzed includes medical and aesthetic information that can be provided by the target user, specifically, facial feature information of the target user, body feature information of the target user, or voice information or text information provided by the target user. The fourth embodiment of the present application is explained mainly taking facial feature information of a target user as an example.
The target user side obtains a first request message for requesting to obtain information to be analyzed of the target user, and the target user side can obtain the information to be analyzed of the target user based on the first request message, which is described in detail in step S502.
Step S502: and aiming at the first request message, obtaining the information to be analyzed of the target user.
In this step, in order to increase the interaction effect between the anchor and the user, the anchor may actively initiate interaction with the user. The target user side thus obtains a first request message, sent by the anchor side associated with the target user side, for requesting the information to be analyzed of the target user. After the target user side obtains the first request message, a first information request area to be analyzed may be loaded on the playing page of the target user side, and a prompt mark for obtaining the information to be analyzed of the target user is presented in that area. The prompt mark may be presented as text, such as "magic mirror skin measurement", so that after the target user sees the prompt mark on the playing page, the target user can determine that the first information request area to be analyzed is used for obtaining the information to be analyzed of the target user. Then, a first trigger operation for the first information request area to be analyzed is obtained; that is, the target user triggers the area, so that the target user side obtains the first trigger operation. Finally, the information to be analyzed of the target user is acquired according to the first trigger operation: specifically, an image template for acquiring the image information of the target user is displayed in the first information request area to be analyzed according to the first trigger operation, the image information of the target user is then correspondingly acquired in the image template through an image acquisition device, and the image information is taken as the information to be analyzed of the target user.
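One way to picture the target user side's handling of this step is the small state machine below; apart from the "magic mirror skin measurement" label taken from the text, the class, states, and method names are invented for illustration.

```python
# Illustrative sketch: the first to-be-analyzed information request area as a
# tiny state machine on the target user side.
class FirstRequestArea:
    def __init__(self):
        self.state = "hidden"
        self.prompt = "magic mirror skin measurement"
        self.template = None

    def on_first_request_message(self):
        # Load the request area (with its prompt mark) on the playing page.
        self.state = "prompt_shown"

    def on_first_trigger(self, template: str = "face"):
        # The target user triggers the area; show the image template.
        if self.state == "prompt_shown":
            self.template = template
            self.state = "template_shown"

    def on_image_captured(self, image_info: bytes) -> bytes:
        # The captured image information becomes the information to be analyzed.
        if self.state != "template_shown":
            raise RuntimeError("no image template is being shown")
        self.state = "captured"
        return image_info


area = FirstRequestArea()
area.on_first_request_message()
area.on_first_trigger()
info_to_be_analyzed = area.on_image_captured(b"<face frames>")
```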
Wherein the image template at least comprises one of the following templates: a face image template, a limb image template, and a hair style image template; correspondingly, the obtained image information at least comprises one of the following: face information, limb information, and hair style information. That is, whichever image template is displayed in the first information request area to be analyzed, the image information obtained by the target user end corresponds to that image template. Taking the case where the information to be analyzed is facial feature information as an example, when the first information request area to be analyzed shows a facial image template, the image information obtained by the target user side is the facial information of the target user, specifically a face image, an eye image, a blackhead image, a color spot image, and the like, together with the coordinate information and number information of these items within the facial image.
In this step, the image obtaining device may be a front-facing camera of the target user side, and the obtained image information of the target user may be a dynamic image comprising at least one frame of static picture; image information made up of multiple frames tends to be more accurate, which in turn improves the accuracy of the analysis result subsequently obtained for the information to be analyzed of the target user.
Step S503: and sending the information to be analyzed to the anchor terminal.
And after the target user side obtains the information to be analyzed, sending the information to be analyzed to the anchor side.
Step S504: and obtaining an analysis result of the information to be analyzed, which is returned by the anchor terminal and aims at the target user.
In this step, after the anchor side receives the information to be analyzed sent by the target user side, the anchor side analyzes the information to be analyzed and returns the resulting analysis result to the target user side; that is, the target user side obtains the analysis result of the information to be analyzed, returned by the anchor side, for the target user. In this step, obtaining the analysis result of the information to be analyzed for the target user returned by the anchor side includes the following two ways.
In the first mode, the analysis result is correspondingly displayed in the image template corresponding to the image information of the target user; specifically, a first part of the analysis result is correspondingly displayed in the image template corresponding to the image information of the target user, where the first part of the analysis result includes a first score. In the second mode, the analysis result is played in the playing page of the target user side; specifically, a second part of the analysis result is played in the playing page of the target user side, where the second part of the analysis result comprises analysis information of each dimension characteristic of the information to be analyzed of the target user and suggestion information for that analysis information, and the analysis information of each dimension characteristic at least comprises skin characteristic information, skin visual effect characteristic information, color spot coordinate and quantity information, and pore coordinate and quantity information.
In the first mode, only the first score is displayed in the playing page of the target user side, so that the target user can directly obtain the evaluation score for his or her face, and the complexity of the playing page of the target user side can be reduced. In the second mode, the analysis information of each dimension characteristic of the target user is displayed in the playing page of the target user side, so that the target user can clearly know his or her specific situation (for example, of the face) and the suggestion information corresponding to that analysis information. For example, the analysis information may be the coordinates and number of color spots, and the corresponding suggestion information may be specific information on how to repair those color spots. It should be noted that the two ways of obtaining the analysis result returned by the anchor side for the target user may be used separately or together.
In this step, the target user side may further obtain an analysis result of the information to be analyzed, which is returned by the anchor side and is directed to at least one other user, where the other user refers to a user using the other user side. Specifically, at least one second information request area to be analyzed loaded on the playing page of the target user side is obtained, a prompt mark for obtaining information to be analyzed of other users is presented in the second information request area to be analyzed, an image template for obtaining image information of other users, which is displayed in the second information request area to be analyzed, is obtained, analysis results for other users are correspondingly displayed in the image templates corresponding to the image information of other users, and/or the analysis results for other users are played in the playing page of the target user side.
The second information request area to be analyzed is different from the first information request area to be analyzed in that when other users use the target user side, the image information of other users can be displayed in the second information request area to be analyzed through the image acquisition device of the target user side; when other users do not use the target user side, that is, other users use their own user sides, the second information request area to be analyzed can not only display the image information of other users, but also correspondingly display the analysis results of other users.
In this step, the method for displaying the analysis result for the other user in the image template corresponding to the image information of the other user includes: and correspondingly displaying a first analysis result of the analysis results aiming at the other users in the image template corresponding to the image information of the other users, wherein the first analysis result comprises a second scoring score. The target user can visually see the score corresponding to the facial feature information of the target user and the score corresponding to the facial feature information of other users at the target user side used by the target user.
In this step, the playing of the analysis result for other users in the playing page of the target user side specifically includes: playing a second part of analysis results of the analysis results aiming at other users in a playing page of the target user side, wherein the second part of analysis results comprise analysis information of all dimension characteristics of information to be analyzed aiming at other users and suggestion information aiming at the analysis information; the analysis information of each dimension characteristic at least comprises one of the following information: skin characteristic information, skin visual effect characteristic information, color spot coordinates, quantity information, pore coordinates and quantity information.
When only the first score of the target user and the second scores of other users are displayed in the playing page of the target user side, the target user can directly obtain the evaluation score for his or her own face, and the complexity of the playing page of the target user side can be reduced. When the analysis information of each dimension characteristic is displayed in the playing page of the target user side, the target user can clearly know the specific situation of his or her own face and the specific situations of the faces of other users, together with the suggestion information corresponding to each user's analysis information. For example, the analysis information may be the coordinates and number of color spots, and the corresponding suggestion information may be specific information on how to repair those color spots. It should be noted that the two ways of obtaining the analysis result returned by the anchor side may be used separately or together.
Further, after the analysis results of the information to be analyzed of the other users and the analysis result of the information to be analyzed of the target user are obtained, the target analysis result screened by the anchor side from these analysis results can also be obtained. In one case, when the information to be analyzed of other users is similar to the information to be analyzed of the target user, the anchor side may automatically select a reasonable analysis result based on big data or on the working experience of the anchor, so as to serve as a reference for the other users and the target user.
Still further, the target user side may also obtain associated object information, returned by the anchor side, that is associated with the information to be analyzed of the target user, and/or obtain associated object information, returned by the anchor side, that is associated with the information to be analyzed of at least one other user. The associated object information associated with the information to be analyzed of the target user and the associated object information associated with the information to be analyzed of the other users refer to link information related to the information to be analyzed, for example, commodity link information for repairing a facial problem; alternatively, they refer to case information associated with the information to be analyzed. When other users are presented in the playing page of the target user side, associated object information associated with the information to be analyzed of those other users can also be obtained. Of course, when other users are presented in the playing page of the target user side, the target user is also presented in that playing page, so the target user side can obtain associated object information associated with the information to be analyzed of the target user as well as associated object information associated with the information to be analyzed of the other users.
In this step, in order to improve the interactivity between the anchor and the users, the target user side may further obtain, as returned by the anchor side, the user corresponding to the target analysis result. For example, the evaluation may be performed on the first score of the target user's analysis result and the second scores of the other users' analysis results, and the user with the highest score is taken as the user corresponding to the target analysis result.
In a fourth embodiment of the present application, the target user side may send, to the anchor side, a second request message for requesting to obtain case information associated with the information to be analyzed, and obtain the case information associated with the information to be analyzed that the anchor side returns for the second request message. Specifically, a floating layer is loaded on the playing page of the target user side, the case information associated with the information to be analyzed is loaded in the floating layer, a second trigger operation for the floating layer is obtained, and the case information associated with the information to be analyzed is selected from the floating layer according to the second trigger operation. The floating layer, also called a mask layer or a mask, is a view with a specific shape (e.g., a rectangle or a circle) that can float over the client interface, and it can also detect touch operations (e.g., clicking or sliding) and draw images.
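As a minimal sketch under assumed names (the message fields and the list-selection helper are illustrative, not part of the present application), the second request message and the floating-layer selection could be modeled as follows.

```python
from typing import List, Optional


def build_second_request(user_id: str, feature_ids: List[str]) -> dict:
    # Hypothetical message layout: carries the feature identifiers
    # associated with the information to be analyzed.
    return {"type": "case_info_request",
            "user_id": user_id,
            "feature_ids": feature_ids}


def select_case_from_floating_layer(cases: List[dict],
                                    tapped_index: int) -> Optional[dict]:
    """Return the case entry that the second trigger operation (a tap on the
    floating layer) landed on, or None if the tap missed the list."""
    if 0 <= tapped_index < len(cases):
        return cases[tapped_index]
    return None
```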
Of course, the case information obtained by the target user side may also be unrelated to the current information to be analyzed; that is, when the anchor side broadcasts a specified medical-aesthetic case, the anchor can actively push the case information corresponding to that case to the target user side through a trigger operation on the anchor side, so that the user using the target user side can obtain the case information through the target user side.
A fourth embodiment of the present application provides an information interaction method, including: obtaining a first request message, sent by an anchor side associated with a target user side, for requesting to obtain information to be analyzed of a target user, where the target user side is the user side corresponding to the target user; obtaining the information to be analyzed of the target user for the first request message; sending the information to be analyzed to the anchor side; and obtaining an analysis result, returned by the anchor side, of the information to be analyzed of the target user. In the fourth embodiment of the present application, the target user side can obtain the first request message actively initiated by the anchor side for obtaining the information to be analyzed of the target user, obtain the information to be analyzed of the target user in a targeted manner according to the first request message, and then send the information to be analyzed to the anchor side; the anchor side can analyze the information to be analyzed and correspondingly obtain an analysis result. Because the analysis result is obtained by analyzing this information to be analyzed, the analysis result that the target user side obtains from the anchor side is more targeted; that is, the interaction between the anchor and the user is increased, and the accuracy of the analysis result provided by the anchor for the information to be analyzed is also improved.
The application also provides another information interaction method. The method is applied to the anchor side. Please refer to fig. 7, which is a flowchart illustrating an information interaction method according to a fifth embodiment of the present application, wherein the method includes the following specific steps.
Step S601: sending a first request message for requesting to acquire information to be analyzed of a target user to a target user side associated with an anchor side.
In this step, the target user side is the user side corresponding to the target user, and the target user is a user who uses the current user side (see the user-side playing page of the figure) to watch the live broadcast. The information to be analyzed refers to medical-aesthetic information provided by the target user, and may specifically be facial feature information of the target user, body feature information of the target user, or voice information or text information provided by the target user. The fifth embodiment of the present application is explained mainly by taking the facial feature information of the target user as an example.
In this step, the anchor side actively sends, to the target user corresponding to the target user side, a first request message for requesting to obtain the information to be analyzed of the target user, so as to increase the interaction between the anchor and the user. Specifically, a first instruction module containing the first request message is obtained on a control page of the anchor side, a third trigger operation for the first instruction module is obtained, and the first request message for requesting to obtain the information to be analyzed of the target user is sent, according to the third trigger operation, to the target user side associated with the anchor side.
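The following Python sketch illustrates, under assumed names only, how the anchor side might construct and send the first request message once the third trigger operation on the first instruction module is detected; send_to_client stands in for whatever live-streaming messaging channel is actually used and is not specified by the present application.

```python
import json
import time
from typing import Callable


def on_first_instruction_triggered(anchor_id: str,
                                   target_user_id: str,
                                   send_to_client: Callable[[str, str], None]) -> dict:
    """Build the first request message and push it to the target user side
    when the third trigger operation on the first instruction module fires."""
    request = {
        "type": "request_info_to_analyze",   # hypothetical message type
        "anchor_id": anchor_id,
        "target_user_id": target_user_id,
        "timestamp": int(time.time()),
    }
    send_to_client(target_user_id, json.dumps(request))
    return request
```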
Step S602: and obtaining the information to be analyzed of the target user returned by the target user side.
In this step, corresponding to the target user side, the target user side obtains a first request message for requesting to obtain information to be analyzed of the target user, and for a specific step of obtaining the information to be analyzed of the target user by the first request message, reference may be made to the contents of the first embodiment, which is not repeated. After the target user side obtains the information to be analyzed, the information to be analyzed is sent to the anchor side, namely the anchor side obtains the information to be analyzed of the target user returned by the target user side.
Step S603: and obtaining an analysis result of the information to be analyzed for the target user.
Specifically, image information of the information to be analyzed of the target user is first obtained; that is, the information to be analyzed is subjected to imaging processing to obtain image information corresponding to the information to be analyzed, and each piece of feature information in the image is obtained by an image processing technique, where the feature information includes at least skin features, eye features, nose features, blackhead features, color spot features, and the like. Then, a corresponding analysis result is screened out from a preset analysis result database according to the feature information. Specifically, at least one piece of dimension feature information of the feature information is obtained, the dimension feature information being the information of the feature information in different dimensions. For example, if the feature information is a skin feature, the corresponding dimension feature information may be skin characteristic information, skin visual effect characteristic information, and the like; if the feature information is color spot feature information, the corresponding dimension feature information may be color spot coordinate information, color spot quantity information, and the like; if the feature information is pore feature information, the corresponding dimension feature information may be pore coordinate information, pore quantity information, and the like. Next, the dimension feature identifier corresponding to each analysis result in the preset analysis result database is obtained. In this step, each analysis result is obtained from historical data, that is, a related analysis result is obtained from historical dimension feature information, and each analysis result is stored in the preset analysis result database. To facilitate subsequent use of the results, each analysis result corresponds to a related dimension feature identifier, so that corresponding dimension feature information can be matched to an analysis result. Finally, the dimension feature information is matched against the dimension feature identifiers corresponding to the analysis results to obtain a target analysis result, and the target analysis result is taken as the corresponding analysis result screened out from the preset analysis result database.
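A minimal Python sketch of the matching step described above, assuming the preset analysis result database is a list of records that each carry a set of dimension feature identifiers; the record whose identifiers overlap most with the extracted dimension feature information is returned as the target analysis result. The record layout and the overlap-based matching rule are illustrative assumptions, not the only way this step could be realized.

```python
from typing import Dict, List, Optional

# Hypothetical preset analysis result database: each record pairs a set of
# dimension feature identifiers with a stored analysis result.
PRESET_ANALYSIS_DB: List[dict] = [
    {"dimension_ids": {"skin_dry", "pore_many"},
     "result": {"score": 72, "suggestion": "hydrating care"}},
    {"dimension_ids": {"color_spot_few", "skin_oily"},
     "result": {"score": 85, "suggestion": "oil-control care"}},
]


def match_analysis_result(dimension_info: Dict[str, str],
                          database: List[dict]) -> Optional[dict]:
    """Match the extracted dimension feature information (e.g. {"skin": "dry"})
    against the dimension feature identifiers of each stored record and return
    the best-matching result as the target analysis result."""
    extracted_ids = {f"{name}_{value}" for name, value in dimension_info.items()}
    best, best_overlap = None, 0
    for entry in database:
        overlap = len(extracted_ids & entry["dimension_ids"])
        if overlap > best_overlap:
            best, best_overlap = entry["result"], overlap
    return best


# Example: dimension feature information extracted from the facial image
print(match_analysis_result({"skin": "dry", "pore": "many"}, PRESET_ANALYSIS_DB))
```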
In the fifth embodiment of the present application, in order to further enhance the interaction between the anchor and the users, the anchor may also invite other users to participate in the interaction. Specifically, a third request message for requesting to obtain information to be analyzed of other users is first sent to at least one other user side associated with the anchor side, where the at least one other user side is the user side corresponding to the at least one other user. Specifically, a third instruction module containing the third request message is obtained on the control page of the anchor side, a fourth trigger operation for the third instruction module is obtained, and the third request message for requesting to obtain the information to be analyzed of the other users is sent, according to the fourth trigger operation, to the other user sides associated with the anchor side. Then, the information to be analyzed of the other users returned by the other user sides is obtained; the manner in which the other user sides obtain the information to be analyzed of their users is the same as that of the target user side, and the steps are not repeated here.
Then, the analysis results of the information to be analyzed of the other users are obtained; for the specific steps, reference may be made to obtaining the analysis result of the information to be analyzed of the target user, with the target user simply replaced by the other users.
Further, after the analysis results of the information to be analyzed of the other users and the analysis result of the information to be analyzed of the target user are obtained, a target analysis result can be screened from these analysis results. Specifically, a first score of the analysis result of the target user is obtained, a second score of the analysis result of each other user is obtained, the first score and the second scores are compared, and the analysis result corresponding to the highest score is taken as the target analysis result.
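For example, the screening described above can reduce to a score comparison such as the following sketch; the dictionary fields are hypothetical and simply mirror the first score and second scores discussed above.

```python
def screen_target_result(target_user_result: dict, other_user_results: list) -> dict:
    """Return the analysis result with the highest score among the target
    user's result (first score) and the other users' results (second scores)."""
    candidates = [target_user_result] + list(other_user_results)
    return max(candidates, key=lambda r: r["score"])


# Usage sketch with made-up scores
target = {"user_id": "target", "score": 78}
others = [{"user_id": "u1", "score": 85}, {"user_id": "u2", "score": 69}]
print(screen_target_result(target, others))  # the result with score 85 wins
```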
Furthermore, the anchor side may also obtain associated object information associated with the information to be analyzed of the target user, and/or obtain associated object information associated with the information to be analyzed of at least one other user. The associated object information associated with the information to be analyzed of the target user and the associated object information associated with the information to be analyzed of the other users refer to link information related to the information to be analyzed, for example, commodity link information for repairing a facial problem; alternatively, they refer to case information associated with the information to be analyzed.
In the fifth embodiment of the present application, in order to improve the interactivity between the anchor and the users, the anchor side may further determine the user associated with the target analysis result. For example, the evaluation may be performed on the first score of the target user's analysis result and the second scores of the other users' analysis results, and the user with the highest score is taken as the user corresponding to the target analysis result.
In the fifth embodiment of the present application, the method further includes: the anchor side may obtain a second request message, sent by the target user side, for requesting to obtain case information associated with the information to be analyzed. Specifically, the second request message contains a feature identifier associated with the information to be analyzed; the feature identifier contained in the second request message is obtained, the case mark of each piece of case information in a preset case information base is then obtained, the feature identifier associated with the information to be analyzed is matched against the case marks, target case information corresponding to a case mark associated with that feature identifier is matched from the preset case information base, and the target case information is taken as the case information associated with the information to be analyzed. Finally, the case information associated with the information to be analyzed is sent to the target user side.
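A sketch of the case-information lookup under assumed data shapes: each entry in the preset case information base carries a case mark (here modeled as a set of feature identifiers), and an entry whose mark matches the feature identifier carried in the second request message is returned as the target case information. The data layout is an assumption for illustration only.

```python
from typing import List, Optional

# Hypothetical preset case information base: each entry carries a case mark.
PRESET_CASE_BASE: List[dict] = [
    {"case_mark": {"color_spot", "uv_damage"}, "case": "spot-repair case A"},
    {"case_mark": {"pore", "oily_skin"}, "case": "pore-care case B"},
]


def find_case_info(feature_ids: List[str], case_base: List[dict]) -> Optional[str]:
    """Match the feature identifiers associated with the information to be
    analyzed against each case mark; return the first matching case info."""
    wanted = set(feature_ids)
    for entry in case_base:
        if wanted & entry["case_mark"]:
            return entry["case"]
    return None


print(find_case_info(["color_spot"], PRESET_CASE_BASE))  # -> "spot-repair case A"
```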
Of course, the case information sent by the anchor side may also be unrelated to the current information to be analyzed; that is, when the anchor side plays a specified medical-aesthetic case, the anchor can actively push the case information corresponding to that case to the target user side through a trigger operation on the anchor side, so that the user using the target user side can obtain the case information through the target user side.
Step S604: and sending the analysis result to the target user side.
After the analysis result of the information to be analyzed of the target user is obtained, the analysis result of the information to be analyzed of the target user is sent to the target user side; alternatively, the analysis result of the information to be analyzed of the target user may also be sent to the other user sides. Correspondingly, after the anchor side obtains the analysis results of the information to be analyzed of the other users, the analysis results of the information to be analyzed of the other users are sent to the target user side and are also sent to the other user sides.
In addition, corresponding to the case in which the anchor side screens a target analysis result from the analysis results of the information to be analyzed of the other users and the analysis result of the information to be analyzed of the target user, sending the analysis result to the target user side and/or the other user sides includes: sending the screened target analysis result to the target user side and/or the other user sides.
A fifth embodiment of the present application provides an information interaction method, including: sending, to a target user side associated with an anchor side, a first request message for requesting to obtain information to be analyzed of a target user, where the target user side is the user side corresponding to the target user; obtaining the information to be analyzed of the target user returned by the target user side; obtaining an analysis result of the information to be analyzed of the target user; and sending the analysis result to the target user side. The anchor side actively initiates the first request message for obtaining the information to be analyzed of the target user and, according to the first request message, obtains in a targeted manner the information to be analyzed of the target user sent by the target user side; the anchor side then analyzes the information to be analyzed and correspondingly obtains an analysis result. Because the analysis result is obtained by analyzing this information to be analyzed, the analysis result that the anchor side provides for the information to be analyzed of the target user is more targeted; that is, the interaction between the anchor and the user is increased, and the accuracy of the analysis result provided by the anchor for the information to be analyzed is also improved.
In the first embodiment of the present application, an information interaction method is provided, and correspondingly, an information interaction apparatus is provided. Fig. 8 is a schematic view of an information interaction apparatus according to a sixth embodiment of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for relevant points. The device embodiments described below are merely illustrative. A sixth embodiment of the present application provides an information interaction apparatus, including: the information processing unit 701 is used for the first end to obtain the information to be analyzed of the target user and then send the information to the second end, or after the first end obtains the information to be analyzed of the target user, the information to be analyzed is analyzed to obtain an analysis result, and the analysis result is sent to the second end; the first end and the second end are a first end and a second end which can interact with instant messages, and the first end and the second end at least carry out video information interaction; the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end comprises an information acquisition window for acquiring information to be analyzed of the target user.
In the second embodiment of the present application, an information interaction method is provided, and correspondingly, the present application provides an information interaction device. Fig. 8 is a schematic view of an information interaction apparatus according to a seventh embodiment of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for relevant points. The device embodiments described below are merely illustrative. A seventh embodiment of the present application provides an information interaction apparatus, including: the information processing unit 701 is used for the first end to obtain information to be analyzed of a target user and send the information to the second end, or after the first end obtains the information to be analyzed of the target user, the first end obtains an analysis result according to the information to be analyzed and sends the analysis result to the second end; the first end can perform instant message interaction with a second end corresponding to at least one service provider, and the first end corresponds to at least one target user; the first end comprises an information acquisition window for acquiring the information to be analyzed of the target user.
In the third embodiment of the present application, an information interaction method is provided, and correspondingly, the present application provides an information interaction device. Fig. 8 is a schematic view of an information interaction apparatus according to an eighth embodiment of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for relevant points. The device embodiments described below are merely illustrative. An eighth embodiment of the present application provides an information interaction apparatus, including: an information processing unit 701, configured to obtain, by a second peer, to-be-analyzed information of a target user sent by a first peer, or obtain, by the second peer, an analysis result sent by the first peer; the second end is a second end capable of performing instant message interaction with a first end corresponding to at least one target user, and the second end corresponds to at least one service provider.
In the fourth embodiment of the present application, an information interaction method is provided, and correspondingly, the present application provides an information interaction device. Fig. 9 is a schematic view of an information interaction apparatus according to a ninth embodiment of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for relevant points. The device embodiments described below are merely illustrative. A ninth embodiment of the present application provides an information interaction apparatus, including: a request message obtaining unit 801, configured to obtain a first request message, which is sent by an anchor associated with a target user side and is used for requesting to obtain information to be analyzed of a target user, where the target user side is a user side corresponding to the target user; an information to be analyzed obtaining unit 802, configured to obtain, for the first request message, information to be analyzed of the target user; an information to be analyzed sending unit 803, configured to send the information to be analyzed to the anchor terminal; an analysis result obtaining unit 804, configured to obtain an analysis result of the information to be analyzed, which is returned by the anchor terminal and is for the target user.
In the fifth embodiment of the present application, an information interaction method is provided, and correspondingly, the present application provides an information interaction device. Fig. 10 is a schematic view of an information interaction apparatus according to a tenth embodiment of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for relevant points. The device embodiments described below are merely illustrative. A tenth embodiment of the present application provides an information interaction apparatus, including: a request message sending unit 901, configured to send a first request message for requesting to obtain information to be analyzed of a target user to a target user side associated with an anchor side, where the target user side is a user side corresponding to the target user; an information to be analyzed obtaining unit 902, configured to obtain information to be analyzed of the target user, which is returned by the target user side; an analysis result obtaining unit 903 configured to obtain an analysis result of information to be analyzed for the target user; an analysis result sending unit 904, configured to send the analysis result to the target user side.
The first to fifth embodiments of the present application provide information interaction methods, and the eleventh embodiment of the present application provides an electronic device corresponding to the methods of the first to fifth embodiments. Reference is made to fig. 11, which shows a schematic diagram of the electronic device of this embodiment. An eleventh embodiment of the present application provides an electronic device, including: a processor 1001; and a memory 1002 configured to store a computer program which, when executed by the processor, performs the information interaction method provided in any of the first to fifth embodiments of the present application.
A twelfth embodiment of the present application provides a computer storage medium corresponding to the methods of the first to fifth embodiments. The computer storage medium stores a computer program which, when executed by a processor, performs the information interaction method provided in any of the first to fifth embodiments of the present application.
The application also provides a multimedia information interaction method. The method is used in the multimedia information interaction process. Please refer to fig. 12, which is a flowchart illustrating a multimedia information interaction method according to a thirteenth embodiment of the present application.
Step S1101: the information interaction request is initiated through the first end capable of initiating the interaction request.
In this step, the first end may be a client used by any user, and the client specifically includes a mobile phone, a tablet computer, and the like. When a first user corresponding to a first end wants to perform information interaction with users corresponding to other clients, the first user needs to initiate an information interaction request through the first end. Specifically, the first end obtains a first trigger operation of a first user for the information interaction request, and sends the information interaction request to other clients according to the first trigger operation. In a thirteenth embodiment of the present application, the other client is defined as a second client, and the user corresponding to the second client is a second user.
Step S1102: and at least one second end receives the information interaction request initiated by the first end.
In the thirteenth embodiment of the present application, the first user may perform information interaction with one second user or with a plurality of second users; therefore, this step involves at least one second end, each second end corresponding to a second user, and the second end receives the information interaction request initiated by the first end.
Step S1103: in response to the second end confirming the operation request for accepting the information interaction request, obtaining a trigger for the image recognition window of the second end, where the image recognition window recognizes the image of the second end and obtains the required information to be analyzed.
In this step, after the second end receives the information interaction request initiated by the first end, the interaction interface corresponding to the second end displays a message box for confirming acceptance or rejection of the information interaction request, so that the second user corresponding to the second end can perform a further operation according to the prompt of the message box. Specifically, the second user obtains, from the interaction interface corresponding to the second end, the prompt for confirming acceptance or rejection of the information interaction request; when the second user wants to perform information interaction with the first user, the option in the message box corresponding to accepting the information interaction request is triggered, and the second end confirms acceptance of the information interaction request according to the trigger operation. When the second user does not want to perform information interaction with the first user, the option in the message box corresponding to rejecting the information interaction request is triggered, and the second end rejects the information interaction request according to the trigger operation.
After the second end responds to the operation of confirming acceptance of the information interaction request, the interaction interface of the second end can display a dynamic image associated with the first user; the dynamic image associated with the first user is obtained through the image acquisition device of the first end and includes at least the dynamic image of the first user and the dynamic image of the environment in which the first user is located. In this step, the dynamic image associated with the first user is used for video interaction with the second user corresponding to the second end.
In this step, the interactive interface of the second end may also display an image recognition window. The image recognition window may be displayed on the interactive interface of the second end directly after the second end confirms acceptance of the information interaction request, or it may be displayed on the interactive interface according to a first trigger operation obtained on the interactive interface of the second end. The image recognition window is used for recognizing an image of the second end and obtaining the required information to be analyzed. The image of the second end may include an image of the second user and an image of the environment in which the second user is located. This step is explained mainly with the image of the second user.
Specifically, the image recognition window is first triggered at the second end, and an image template for obtaining the information to be analyzed is displayed in the image recognition window, where the image template includes at least one of the following: a face image template, a limb image template, and a hair style image template. Then, image information is correspondingly obtained in the image template through an image acquisition device, and the image information is taken as the information to be analyzed. Corresponding to the image template, the image information obtained by the image acquisition device in the image template includes at least one of the following: face information, limb information, and hair style information.
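As a rough sketch only, obtaining the information to be analyzed through the image recognition window might be modeled as below; capture_frame stands in for the device's image acquisition interface, and the template names simply mirror the ones listed above.

```python
from typing import Callable, Optional

IMAGE_TEMPLATES = ("face", "limb", "hair_style")   # the templates listed above


def obtain_info_to_analyze(template: str,
                           capture_frame: Callable[[], bytes]) -> Optional[dict]:
    """Show the chosen image template in the recognition window, capture the
    image information inside it, and treat that image as the information to
    be analyzed."""
    if template not in IMAGE_TEMPLATES:
        return None
    image_bytes = capture_frame()          # placeholder for the camera API
    return {"template": template, "image": image_bytes}


# Usage sketch with a dummy capture function
info = obtain_info_to_analyze("face", lambda: b"fake-image-bytes")
```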
Step S1104: and analyzing the information to be analyzed and obtaining an analysis result.
After the information to be analyzed is obtained, the second end can send the information to be analyzed to the first end for analysis, or the information to be analyzed can be forwarded through the first end to a third-party server for analysis, and the third-party server analyzes the information to be analyzed and obtains an analysis result. Specifically, the information to be analyzed is subjected to imaging processing to obtain image information corresponding to the information to be analyzed, and each piece of feature information in the image is obtained by an image processing technique, where the feature information includes at least skin features, eye features, nose features, blackhead features, color spot features, and the like. Then, a corresponding analysis result is screened out from a preset analysis result database according to the feature information. Specifically, at least one piece of dimension feature information of the feature information is obtained, the dimension feature information being the information of the feature information in different dimensions. For example, if the feature information is a skin feature, the corresponding dimension feature information may be skin characteristic information, skin visual effect characteristic information, and the like; if the feature information is color spot feature information, the corresponding dimension feature information may be color spot coordinate information, color spot quantity information, and the like; if the feature information is pore feature information, the corresponding dimension feature information may be pore coordinate information, pore quantity information, and the like. Next, the dimension feature identifier corresponding to each analysis result in the preset analysis result database is obtained. In this step, each analysis result is obtained from historical data, that is, a related analysis result is obtained from historical dimension feature information, and each analysis result is stored in the preset analysis result database. To facilitate subsequent use of the results, each analysis result corresponds to a related dimension feature identifier, so that corresponding dimension feature information can be matched to an analysis result. Finally, the dimension feature information is matched against the dimension feature identifiers corresponding to the analysis results to obtain a target analysis result, and the target analysis result is taken as the corresponding analysis result screened out from the preset analysis result database.
Step S1105: sending the analysis result to at least the first end.
After the analysis result is obtained, the analysis result may be sent to the first end. In the thirteenth embodiment of the present application, sending the analysis result to at least the first end includes: sending the analysis result to the first end and, at the same time, sending the analysis result to the second end, so that the analysis result can be correspondingly displayed in the image recognition window of the second end, making it convenient for the second user corresponding to the second end to learn the analysis result and thereby facilitating the information interaction between the first user and the second user corresponding to the second end.
A thirteenth embodiment of the present application provides a multimedia information interaction method, including: initiating an information interaction request through a first end capable of initiating an interaction request; receiving, by at least one second end, the information interaction request initiated by the first end; in response to the second end confirming the operation request for accepting the information interaction request, obtaining a trigger for an image recognition window of the second end, where the image recognition window recognizes an image of the second end and obtains the required information to be analyzed; analyzing the information to be analyzed and obtaining an analysis result; and sending the analysis result to at least the first end. In the thirteenth embodiment of the present application, the first end actively initiates the interaction with the second end, and the analysis result can be obtained, so that during the interaction the first user corresponding to the first end and the second user corresponding to the second end are not limited to the interaction modes of text or video chat: the image obtained at the second end is also analyzed, which increases the interest of the interaction; at the same time, the analysis result can be obtained directly and with higher accuracy, thereby improving the user experience.
For further understanding of the multimedia information interaction method provided in the thirteenth embodiment of the present application, the following description will be made with reference to specific application scenarios. Fig. 13 is a schematic diagram of an application scenario provided in the thirteenth embodiment of the present application.
Specifically, the application scenario is an instant messaging scenario, the instant messaging scenario includes a first end corresponding to a first user, the first end may be a mobile phone or a tablet computer, and the first user is an initiator of the interactive information; and the second end corresponds to at least one second user, the second end can be a mobile phone or a tablet computer, and the second user is a receiver of the interactive information.
In this scenario, in order to improve the interest of the interaction between the first user and the second user, the second user may interact with the first user according to the information interaction request initiated through the first end. Specifically, the first user initiates an information interaction request by triggering the first end, and the first end sends the information interaction request to the second end. The second end obtains the information interaction request, and a message box for confirming acceptance or rejection of the information interaction request is displayed on the interaction interface corresponding to the second end, so that the second user corresponding to the second end can perform a further operation according to the prompt of the message box. Specifically, the second user obtains the prompt from the interaction interface corresponding to the second end, triggers the option in the message box corresponding to accepting the information interaction request, and the second end confirms acceptance of the information interaction request according to the trigger operation. Then, after the second end confirms acceptance of the information interaction request, the interaction interface of the second end displays an image recognition window, an image template for obtaining the information to be analyzed is displayed in the image recognition window, image information of the second user is correspondingly obtained in the image template through an image acquisition device, and the image information is taken as the information to be analyzed. The second end can send the information to be analyzed to a third-party server for processing, or the information to be analyzed can be processed at the second end. Specifically, target dimension feature information of the image information is obtained, an analysis result matching the target dimension feature information is screened out from a preset analysis result information base according to the target dimension feature information, the analysis result is displayed in the image recognition window of the second end, and the analysis result is sent to the first end.
In this scenario, in order to simplify the content on the playing page of the first end, the analysis result displayed in the image recognition window on the interaction interface of the first end is only a partial analysis result, specifically the first score. The first user can learn the interaction result from the intuitive score. Of course, in this scenario, all the contents of the analysis result may also be displayed on the interaction interface of the first end, in text or voice form; that is, all the contents of the analysis result are displayed as text information, or all the contents of the analysis result are played as voice.
In this way, the interaction with the second end is actively initiated through the first end, and the first end can obtain the analysis result produced by analyzing the information to be analyzed of the second end, so that during the interaction the first user corresponding to the first end and the second user corresponding to the second end are not limited to the interaction modes of text or video chat but can also have the relevant images analyzed; this increases the interest of the interaction while the obtained analysis result is more accurate, thereby improving the user experience.
In the thirteenth embodiment of the present application, a multimedia information interaction method is provided, and correspondingly, a multimedia information interaction apparatus is provided. Fig. 14 is a schematic view of a multimedia information interaction device according to a fourteenth embodiment of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for relevant points. The device embodiments described below are merely illustrative. A fourteenth embodiment of the present application further provides a multimedia information interaction apparatus, which is used in a multimedia information interaction process, and includes: an information interaction request sending unit 1201, configured to initiate an information interaction request through a first end capable of initiating an interaction request; an information interaction request receiving unit 1202, configured to receive, by at least one second end, an information interaction request initiated by the first end; an information to be analyzed obtaining unit 1203, configured to respond to the second end to confirm that the operation request of the information interaction request is accepted, and obtain a trigger for an image recognition window of the second end, where the image recognition window recognizes an image of the second end and obtains required information to be analyzed; an analysis unit 1204, configured to analyze the information to be analyzed and obtain an analysis result; an analysis result sending unit 1205, configured to send the analysis result to at least the first end.
The thirteenth embodiment of the present application provides a multimedia information interaction method, and a fifteenth embodiment of the present application provides an electronic device corresponding to the method of the thirteenth embodiment. Reference is made to fig. 11, which shows a schematic diagram of the electronic device of this embodiment. A fifteenth embodiment of the present application provides an electronic device, including: a processor 1101; and a memory 1102 for storing a computer program which, when executed by the processor, performs the multimedia information interaction method provided in the thirteenth embodiment of the present application.
A sixteenth embodiment of the present application provides a computer storage medium corresponding to the method of the thirteenth embodiment. The computer storage medium stores a computer program which, when executed by a processor, performs the multimedia information interaction method provided in the thirteenth embodiment of the present application.
Although the present application has been described with reference to the preferred embodiments, it is not intended to limit the present application, and those skilled in the art can make variations and modifications without departing from the spirit and scope of the present application, therefore, the scope of the present application should be determined by the claims that follow.

Claims (13)

1. An information interaction method, comprising:
providing a first end and a second end which can perform instant message interaction, wherein the first end and the second end at least carry out video information interaction;
the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end comprises an information acquisition window for acquiring information to be analyzed of the target user;
and the first end acquires the information to be analyzed of the target user and then sends the information to the second end, or the first end acquires the information to be analyzed of the target user, then analyzes the information to be analyzed to acquire an analysis result and sends the analysis result to the second end.
2. The information interaction method according to claim 1, wherein after the first end obtains the information to be analyzed of the target user and sends the information to the second end, the second end analyzes the information to be analyzed to obtain an analysis result, and the second end presents the analysis result to the service provider.
3. The information interaction method according to claim 1, further comprising: and the second end or the first end initiates a request for acquiring the information to be analyzed, an information acquisition program is started after the first end or the second end agrees, and the first end acquires the information to be analyzed of the target user through the information acquisition window.
4. The information interaction method according to claim 3, wherein the first end acquires the information to be analyzed of the target user through the information acquisition window, and the method comprises the following steps:
the information acquisition program is started, and an image template used for acquiring the information to be analyzed of the target user is displayed in the information acquisition window;
correspondingly obtaining the image information of the target user in the image template through the image obtaining device at the first end;
and taking the image information as the information to be analyzed of the target user.
5. The information interaction method according to claim 2, wherein the first end or the second end analyzing the information to be analyzed to obtain an analysis result comprises: and the first end or the second end obtains the characteristic information of the information to be analyzed, and corresponding analysis results are screened out from a preset analysis result database according to the characteristic information.
6. The information interaction method according to claim 5, wherein the screening out of the corresponding analysis results from the preset analysis result database according to the characteristic information comprises:
obtaining at least one dimension characteristic information of the characteristic information;
obtaining a dimension characteristic identifier corresponding to each analysis result in the preset analysis result database;
and matching the dimension characteristic information with the dimension characteristic identifier corresponding to each analysis result to obtain a target analysis result, and taking the target analysis result as the corresponding analysis result screened out from the preset analysis result database.
7. The information interaction method of claim 2, further comprising: and after the second end analyzes the information to be analyzed to obtain an analysis result, obtaining associated object information associated with the analysis result according to the analysis result, and presenting the associated object information to the service provider by the second end.
8. The information interaction method of claim 7, further comprising: and the second end sends the associated object information to the first end, and the first end presents the associated object information to the target user.
9. The information interaction method according to claim 1, further comprising: a third end capable of performing instant message interaction with the first end and the second end, wherein the third end corresponds to at least one other user;
the third end obtains the information to be analyzed of the other users and then sends the information to the second end, the second end analyzes the information to be analyzed to obtain an analysis result, and the second end presents the analysis result to the service provider;
or after the third end obtains the information to be analyzed of the other users, the third end obtains an analysis result according to the information to be analyzed, the analysis result is sent to the second end, and the second end presents the analysis result to the service provider.
10. An information interaction method, comprising:
the first end can perform instant message interaction with a second end corresponding to at least one service provider, and the first end corresponds to at least one target user;
the first end comprises an information acquisition window for acquiring information to be analyzed of the target user;
and the first end acquires the information to be analyzed of the target user and sends the information to the second end, or after acquiring the information to be analyzed of the target user, the first end acquires an analysis result according to the information to be analyzed and sends the analysis result to the second end.
11. An information interaction method, comprising:
the second end can perform instant message interaction with the first end corresponding to at least one target user, and the second end corresponds to at least one service provider;
the second end obtains the information to be analyzed of the target user sent by the first end, or the second end obtains the analysis result sent by the first end.
12. A multimedia information interaction method is used in a multimedia information interaction process, and is characterized by comprising the following steps:
initiating an information interaction request through a first end capable of initiating an interaction request;
receiving an information interaction request initiated by the first end through at least one second end;
responding to the second end to confirm that the operation request of the information interaction request is accepted, and triggering an image recognition window at the second end, wherein the image recognition window recognizes the image of the second end and acquires the required information to be analyzed;
analyzing the information to be analyzed and obtaining an analysis result;
and sending the analysis result to at least the first end.
13. An information interaction apparatus, comprising:
the information processing unit is used for the first end to obtain the information to be analyzed of the target user and then send the information to the second end, or after the first end obtains the information to be analyzed of the target user, the information to be analyzed is analyzed to obtain an analysis result, and the analysis result is sent to the second end;
the first end and the second end are a first end and a second end which can interact with instant messages, and the first end and the second end at least carry out video information interaction;
the first end corresponds to at least one target user, the second end corresponds to at least one service provider, and the first end comprises an information acquisition window for acquiring information to be analyzed of the target user.
CN202110459499.2A 2021-04-27 2021-04-27 Information interaction method, multimedia information interaction method and device Active CN113194323B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110459499.2A CN113194323B (en) 2021-04-27 2021-04-27 Information interaction method, multimedia information interaction method and device

Publications (2)

Publication Number Publication Date
CN113194323A true CN113194323A (en) 2021-07-30
CN113194323B CN113194323B (en) 2023-11-10

Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1580684A1 (en) * 1998-04-13 2005-09-28 Nevenengineering, Inc. Face recognition from video images
CN101325679A (en) * 2007-06-13 2008-12-17 Sony Corporation Information processing apparatus, information processing method, program, and recording medium
US20120288168A1 (en) * 2011-05-09 2012-11-15 Telibrahma Convergent Communications Pvt. Ltd. System and a method for enhancing appearance of a face
WO2012172625A1 (en) * 2011-06-13 2012-12-20 株式会社洛洛.Com Beauty SNS system and program
US20150265162A1 (en) * 2012-10-24 2015-09-24 Cathworks Ltd. Automated measurement system and method for coronary artery disease scoring
CN105407799A (en) * 2013-07-31 2016-03-16 Panasonic Intellectual Property Corporation of America Skin analysis method, skin analysis device, and method for controlling skin analysis device
US20160300379A1 (en) * 2014-11-05 2016-10-13 Intel Corporation Avatar video apparatus and method
CN105212896A (en) * 2015-08-25 2016-01-06 Nubia Technology Co., Ltd. Health analysis method and terminal
CN105205479A (en) * 2015-10-28 2015-12-30 Xiaomi Inc. Facial attractiveness evaluation method, device and terminal device
US20170319123A1 (en) * 2016-05-06 2017-11-09 The Board Of Trustees Of The Leland Stanford Junior University Systems and Methods for Using Mobile and Wearable Video Capture and Feedback Platforms for Therapy of Mental Disorders
CN105933536A (en) * 2016-06-15 2016-09-07 Kong Mabo Health service system based on mobile phone app
CN206379992U (en) * 2016-10-11 2017-08-04 Wuhan Chang'e Medical Anti-aging Robot Co., Ltd. Skin quality testing and analysis system
US20200099960A1 (en) * 2016-12-19 2020-03-26 Guangzhou Huya Information Technology Co., Ltd. Video Stream Based Live Stream Interaction Method And Corresponding Device
CN106651879A (en) * 2016-12-23 2017-05-10 Shenzhen Nihe Technology Co., Ltd. Method and system for extracting a nail image
US20190362134A1 (en) * 2017-02-01 2019-11-28 Lg Household & Health Care Ltd. Makeup evaluation system and operating method thereof
WO2019024068A1 (en) * 2017-08-04 2019-02-07 Xinova, LLC Systems and methods for detecting emotion in video data
CN107527024A (en) * 2017-08-08 2017-12-29 Beijing Xiaomi Mobile Software Co., Ltd. Facial attractiveness evaluation method and device
CN110580486A (en) * 2018-06-07 2019-12-17 Alibaba Group Holding Ltd. Data processing method and device, electronic equipment and readable medium
CN108846676A (en) * 2018-08-02 2018-11-20 Ping An Technology (Shenzhen) Co., Ltd. Biometric-assisted payment method, device, computer equipment and storage medium
CN109767845A (en) * 2018-12-18 2019-05-17 Shenzhen OneConnect Smart Technology Co., Ltd. Medical follow-up method, apparatus, computer equipment and storage medium
CN110096944A (en) * 2019-02-15 2019-08-06 Chongqing Ebaoquan Network Technology Co., Ltd. Electronic contract signing method, system and terminal device
KR20200107469A (en) * 2019-03-08 2020-09-16 AI Nation Co., Ltd. Method for providing personal makeup style recommendation services based on beauty scores
CN110473621A (en) * 2019-07-24 2019-11-19 Shanghai United Imaging Intelligence Co., Ltd. Diagnostic data display method, computer equipment and storage medium
US10881357B1 (en) * 2019-09-18 2021-01-05 Panasonic Avionics Corporation Systems and methods for monitoring the health of vehicle passengers using camera images
CN111259757A (en) * 2020-01-13 2020-06-09 Alipay Labs (Singapore) Pte. Ltd. Image-based liveness detection method, device and equipment
CN112060101A (en) * 2020-05-26 2020-12-11 Zhejiang Hongji Intelligent Control Co., Ltd. Intelligent nurse robot system
CN112472088A (en) * 2020-10-22 2021-03-12 Shenzhen University Emotional state evaluation method and device, intelligent terminal and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DENG SHENGLI: "Analysis of user experience in interactive information services", Library Tribune, no. 02 *

Also Published As

Publication number Publication date
CN113194323B (en) 2023-11-10

Similar Documents

Publication Publication Date Title
EP3874350B1 (en) Intelligent agents for managing data associated with three-dimensional objects
CN113168231A (en) Enhanced techniques for tracking movement of real world objects to improve virtual object positioning
CN110662083A (en) Data processing method and device, electronic equipment and storage medium
CN111260545A (en) Method and device for generating image
CN112653902B (en) Speaker recognition method and device and electronic equipment
CN103079092B Method and apparatus for obtaining person information from video
US20220013026A1 (en) Method for video interaction and electronic device
CN110472099B (en) Interactive video generation method and device and storage medium
WO2018133825A1 (en) Method for processing video images in video call, terminal device, server, and storage medium
CN109992237B (en) Intelligent voice equipment control method and device, computer equipment and storage medium
CN110401810B (en) Virtual picture processing method, device and system, electronic equipment and storage medium
CN109862380B (en) Video data processing method, device and server, electronic equipment and storage medium
CN110377574B (en) Picture collaborative processing method and device, storage medium and electronic device
US20170171621A1 (en) Method and Electronic Device for Information Processing
CN112866577B (en) Image processing method and device, computer readable medium and electronic equipment
CN111741321A (en) Live broadcast control method, device, equipment and computer storage medium
US9407864B2 (en) Data processing method and electronic device
CN111835617B User avatar adjustment method and device, and electronic equipment
CN113194323B (en) Information interaction method, multimedia information interaction method and device
US11507749B2 (en) Configuration for providing a service through a mobile language interpretation platform
CN114598913A (en) Multi-user double-recording interaction control method and system
CN111461005A (en) Gesture recognition method and device, computer equipment and storage medium
KR20190062145A (en) Method for providing messaging service and computer readable medium for storing program for executing the method for providing messaging service
US20230196680A1 (en) Terminal apparatus, medium, and method of operating terminal apparatus
WO2024038699A1 (en) Expression processing device, expression processing method, and expression processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant