WO2022017201A1 - Livestream motion sensing project interaction method and apparatus - Google Patents


Info

Publication number: WO2022017201A1
Application number: PCT/CN2021/105588 (CN2021105588W)
Authority: WO (WIPO, PCT)
Prior art keywords: somatosensory, user, information, current user, somatosensory information
Other languages: French (fr), Chinese (zh)
Inventor: 张晓波
Original Assignee: 北京达佳互联信息技术有限公司 (Beijing Dajia Internet Information Technology Co., Ltd.)
Application filed by 北京达佳互联信息技术有限公司
Publication of WO2022017201A1

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
              • H04N 21/21 Server components or server architectures
                • H04N 21/218 Source of audio or video content, e.g. local disk arrays
                  • H04N 21/2187 Live feed
            • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N 21/47 End-user applications
                • H04N 21/475 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
                • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
                  • H04N 21/4788 Supplemental services for communicating with other users, e.g. chatting
    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
              • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • The present disclosure relates to the field of computer technologies, and in particular to a method and apparatus for interaction with live broadcast somatosensory items.
  • Performing a live broadcast through various live broadcast applications, or watching other people's live broadcasts, has become a common form of leisure and entertainment.
  • How to enhance the interactivity of a live broadcast and give users a better experience has therefore become an important research direction.
  • Embodiments of the present disclosure provide a method and device for interacting with live somatosensory items.
  • the technical solutions of the present disclosure are as follows:
  • A method for interacting with a live broadcast somatosensory item is provided. The method is applied to a client and includes: determining, according to a collected user screen of a current user, the somatosensory information corresponding to the somatosensory action in the user screen; acquiring an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user; and displaying the interaction result in combination with the live interface to which the current user screen belongs.
  • Acquiring an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring, as the interaction result, a scoring result determined by scoring the somatosensory actions based on the somatosensory information of the current user and the somatosensory information of the at least one associated user.
  • Acquiring a scoring result determined after somatosensory action scoring is performed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, as the interaction result, includes: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information that matches the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by the at least one associated user; and taking the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
  • Acquiring the interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring, as the interaction result, a somatosensory action image obtained by processing the somatosensory information of the current user and the somatosensory information of the at least one associated user.
  • Acquiring the interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: transmitting user somatosensory information and/or the interaction result through a game information link bridge between the respective clients of the current user and the associated user.
  • the method further includes: performing interactive processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain an interactive result.
  • Processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result includes: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and of the associated user, where the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and of the associated user together as the interaction result.
  • The method further includes: reporting the somatosensory information to a server; correspondingly, acquiring an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring an interaction result processed by the server based on the somatosensory information of the current user and the somatosensory information of the at least one associated user.
  • The method further includes: in response to the current user's operation of opening the live room and selecting the somatosensory item duel, sending an opponent matching request to the server; and receiving, from the server, at least one associated user matched with the current user and an interaction start instruction.
  • Determining the somatosensory information corresponding to the somatosensory action in the user screen includes: in response to an interaction start instruction sent by the server, collecting a user screen that includes the user's somatosensory action; inputting the user screen into a somatosensory information acquisition model; and obtaining the somatosensory information, corresponding to the user's somatosensory action, output by the somatosensory information acquisition model.
  • the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion, and limb motion.
  • an interactive device for live broadcast somatosensory items comprising:
  • a somatosensory information determining module configured to determine the somatosensory information corresponding to the somatosensory action in the user screen according to the collected user screen of the current user;
  • an interaction result acquisition module configured to perform acquisition of an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user;
  • the interactive result display module is configured to display the interactive result in combination with the live interface to which the current user screen belongs.
  • the interactive result acquisition module includes:
  • the first interaction result obtaining unit is configured to acquire, as the interaction result, a scoring result determined by scoring the somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one of the associated users.
  • the first interaction result obtaining unit is further configured to: compare the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information that matches the somatosensory information of the current user; take the score corresponding to the target preset somatosensory information as the scoring result of the current user; receive the scoring result of the associated user sent by at least one of the associated users; and take the scoring result of the current user and the scoring result of at least one of the associated users as the interaction result.
  • the interactive result acquisition module is also configured to execute:
  • a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user is acquired as the interaction result.
  • the interactive result acquisition module is also configured to execute:
  • the user somatosensory information and/or the interaction result is transmitted through the game information link bridge between the respective clients of the current user and the associated user.
  • the interactive result acquisition module further includes:
  • the second interaction result obtaining unit is configured to perform interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user, so as to obtain an interaction result.
  • the second interaction result obtaining unit is further configured to: process the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and of the associated user, where the item information includes a scoring result and a somatosensory action image; and take the item information of the current user and of the associated user together as the interaction result.
  • the interactive device for live broadcast of somatosensory items further includes:
  • an information reporting module configured to perform reporting of the somatosensory information to the server
  • the interactive result acquisition module is configured to acquire the interaction result processed by the server based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • the interactive device for live broadcast of somatosensory items further includes:
  • the request sending module is configured to send an opponent matching request to the server in response to the current user's operation of opening the live room and selecting the somatosensory item duel;
  • the instruction receiving module is configured to receive at least one associated user that is fed back by the server and matched with the current user, and an interaction start instruction.
  • the somatosensory information determination module includes:
  • a user image acquisition unit configured to collect, in response to an interaction start instruction sent by the server, a user screen that includes the user's somatosensory action;
  • the somatosensory information acquisition unit is configured to execute inputting the user screen into a somatosensory information acquisition model, and acquire somatosensory information corresponding to the user's somatosensory action output from the somatosensory information acquisition model.
  • the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion, and limb motion.
  • An electronic device is provided, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the following steps: determining, according to the collected user screen of the current user, the somatosensory information corresponding to the somatosensory action in the user screen; acquiring an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user; and displaying the interaction result in combination with the live interface to which the current user screen belongs.
  • the processor is configured to further implement the following step when executing the instructions: acquiring, as the interaction result, a scoring result determined by scoring the somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • the processor is configured to further implement the following steps when executing the instructions: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information that matches the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by at least one of the associated users; and taking the scoring result of the current user and the scoring result of at least one of the associated users as the interaction result.
  • the processor is configured to further implement the following step when executing the instructions: acquiring, as the interaction result, a somatosensory action image obtained by processing the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • the processor is configured to further implement the following step when executing the instructions: transmitting user somatosensory information and/or the interaction result through the game information link bridge between the respective clients of the current user and the associated user.
  • the processor is configured to further implement the following step when executing the instructions: performing interactive processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result.
  • the processor is configured to further implement the following steps when executing the instructions: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and of the associated user, where the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and of the associated user together as the interaction result.
  • the processor is configured to further implement the following steps when executing the instructions: reporting the somatosensory information to a server; correspondingly, acquiring the interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interaction result processed by the server based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • the processor is configured to further implement the following steps when executing the instructions: in response to the current user's operation of opening the live broadcast room and selecting a somatosensory item duel, sending an opponent matching request to the server; and receiving at least one associated user matched with the current user, and an interaction start instruction, fed back by the server.
  • the processor is configured to further implement the following steps when executing the instructions: in response to the interaction start instruction sent by the server, collecting a user picture including the user's somatosensory action; inputting the user picture into the somatosensory information acquisition model; and acquiring the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
  • the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion, and limb motion.
  • A computer-readable storage medium is provided. When the instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the following steps: determining, according to the collected user screen of the current user, the somatosensory information corresponding to the somatosensory action in the user screen; acquiring an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user; and displaying the interaction result in combination with the live interface to which the current user screen belongs.
  • the electronic device when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device can further perform the following steps: acquiring the somatosensory information based on the current user and at least one of the associations The scoring result determined after the somatosensory action is scored on the user's somatosensory information is used as the interaction result.
  • When the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is enabled to further perform the following steps: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information that matches the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by at least one associated user; and taking the scoring result of the current user and the scoring result of at least one associated user as the interaction result.
  • When the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is enabled to further perform the following step: acquiring, as the interaction result, a somatosensory action image obtained by processing the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • When the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is enabled to further perform the following step: transmitting user somatosensory information and/or the interaction result through the game information link bridge between the respective clients of the current user and the associated user.
  • the electronic device when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device can further perform the following steps: based on the somatosensory information of the current user and the transmitted associated user The somatosensory information is interactively processed to obtain interactive results.
  • the electronic device when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device can further perform the following steps: the somatosensory information of the current user and the transmitted somatosensory information of the associated user The information is processed to determine the item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; the item information of the current user and the associated user is taken together as an interaction result.
  • the electronic device when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device can further perform the following steps: reporting the somatosensory information to the server; correspondingly, obtaining the information based on The interaction result processed by the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interactive result processed by the server based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • the electronic device when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device can further perform the following steps: in response to the current user's operation of opening the live room and selecting a somatosensory item duel, Send an opponent matching request to the server; receive at least one associated user matched with the current user fed back by the server, and an interaction start instruction.
  • When the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is enabled to further perform the following steps: in response to the interaction start instruction sent by the server, collecting a user screen that includes the user's somatosensory action; inputting the user screen into the somatosensory information acquisition model; and obtaining the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
  • the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion, and limb motion.
  • A computer program product for use in conjunction with an electronic device is provided. The computer program product comprises a computer-readable storage medium and a computer program mechanism embedded therein; after the program is loaded and executed by a computer, the following steps can be performed: determining, according to the collected user screen of the current user, the somatosensory information corresponding to the somatosensory action in the user screen; acquiring an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user; and displaying the interaction result in combination with the live interface to which the current user screen belongs.
  • The following step can be performed: acquiring, as the interaction result, a scoring result determined by scoring the somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • The following steps can be performed: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information that matches the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by at least one of the associated users; and taking the scoring result of the current user and the scoring result of at least one of the associated users as the interaction result.
  • the computer program can perform the following steps after being loaded and executed by a computer: acquiring a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user , as a result of the interaction.
  • The following step can be performed: transmitting user somatosensory information and/or the interaction result through the game information link bridge between the respective clients of the current user and the associated user.
  • the following steps can be performed: perform interactive processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user, so as to obtain interactive result.
  • The following steps can be performed: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and of the associated user, where the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and of the associated user together as the interaction result.
  • The computer program can perform the following steps after being loaded and executed by a computer: reporting the somatosensory information to a server; correspondingly, acquiring the interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interaction result processed by the server based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • the computer program can perform the following steps after being loaded and executed by the computer: in response to the current user's operation of opening the live room and selecting a somatosensory item duel, sending an opponent matching request to the server; receiving the The server feeds back at least one associated user that matches the current user, and an interaction start instruction.
  • The following steps can be performed: in response to an interaction start instruction sent by the server, collecting a user image including the user's somatosensory action; inputting the user image into a somatosensory information acquisition model; and acquiring the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
  • the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion, and limb motion.
  • the embodiment of the present disclosure enriches the interactive form between the anchor and the audience by performing a somatosensory game in the live broadcast room, and can realize the interaction between multiple anchors, enhance the interactivity of the live broadcast, and enrich the scene of the interaction between the live broadcasters.
  • Fig. 1 is a flow chart of a method for interacting with a live broadcast somatosensory item according to an exemplary embodiment.
  • Fig. 2a is a flow chart of a method for interacting with a live somatosensory item according to an exemplary embodiment.
  • Fig. 2b is a schematic diagram showing the interaction of a live somatosensory item according to an exemplary embodiment.
  • Fig. 3a is a flow chart of a method for interacting with a live broadcast somatosensory item according to an exemplary embodiment.
  • Fig. 3b is a schematic diagram showing interactive result transmission through a game information connection bridge according to an exemplary embodiment.
  • Fig. 4a is a flow chart of a method for interacting with a live somatosensory item according to an exemplary embodiment.
  • Fig. 4b is a schematic diagram showing opponent matching of a live somatosensory project according to an exemplary embodiment.
  • Fig. 5 is a block diagram of an interactive device for a live somatosensory item according to an exemplary embodiment.
  • FIG. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment.
  • In existing live broadcast interaction there is no interaction among multiple users; the embodiments of the present disclosure are applicable to the case of interacting in the form of live broadcast somatosensory items in the live broadcast room.
  • According to the embodiments of the present disclosure, the somatosensory information corresponding to the somatosensory action in the user screen is determined according to the collected user screen of the current user; an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user is then acquired, where the associated user is a user matched with the current user; and the interaction result is finally displayed in the live broadcast interface to which the current user screen belongs. This enriches the forms of interaction between users, enables interaction among multiple users, and enriches the user interaction scenes in the live broadcast room.
  • Fig. 1 is a flow chart of a method for interacting with live somatosensory items according to an exemplary embodiment. As shown in Fig. 1, the method is used in an electronic device, is executed by a processor configured in the electronic device, and includes the following steps.
  • S11 Determine the somatosensory information corresponding to the somatosensory action in the user screen according to the collected user screen of the current user.
  • the user picture is a picture collected by the client that contains the user's somatosensory actions, and is used to obtain the user's somatosensory information.
  • The user picture may be the picture collected by the client's camera when the user blinks or shakes his head. The somatosensory information is information, extracted from the somatosensory movement of the user's head, face, or limbs, that can characterize the current user's somatosensory action; for example, the somatosensory information includes the head-shaking action together with the amplitude and frequency of shaking the head, and other information that can characterize the user's current somatosensory movement.
  • the client collects the user picture of the current user through the camera, then identifies the user picture, extracts the features of the somatosensory action in the user picture, and constitutes the user's somatosensory information by at least one somatosensory action feature.
  • For example, the front camera of the client collects the current user's user screen in real time, image recognition is performed on the user screen, and the current user's body movement is determined to be a blink; information such as the blink frequency of "twice per second" is then encapsulated as the somatosensory information corresponding to the somatosensory action in the user screen.
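  • As an illustration only (not part of the patent; the class name BlinkExtractor and the per-frame eye-openness input are assumptions), the following sketch shows how a stream of per-frame eye-openness values from any face-landmark detector could be turned into somatosensory information consisting of an action type and a blink frequency.

```python
from collections import deque
from dataclasses import dataclass
import time

@dataclass
class SomatosensoryInfo:
    action_type: str          # e.g. "blink", "shake_head", "raise_arm"
    amplitude: float = 0.0    # normalised motion amplitude
    frequency: float = 0.0    # occurrences per second

class BlinkExtractor:
    """Turns per-frame eye-openness values (from any face-landmark detector,
    hypothetical here) into somatosensory information for the blink action."""

    def __init__(self, threshold: float = 0.2, window_s: float = 5.0):
        self.threshold = threshold        # below this value the eye counts as closed
        self.window_s = window_s          # sliding window used for the frequency
        self.blink_times: deque = deque()
        self._closed = False

    def update(self, eye_openness: float, now: float | None = None) -> SomatosensoryInfo:
        now = time.monotonic() if now is None else now
        # Count the open -> closed transition as one blink.
        if eye_openness < self.threshold and not self._closed:
            self._closed = True
            self.blink_times.append(now)
        elif eye_openness >= self.threshold:
            self._closed = False
        # Keep only blinks inside the sliding window.
        while self.blink_times and now - self.blink_times[0] > self.window_s:
            self.blink_times.popleft()
        frequency = len(self.blink_times) / self.window_s
        return SomatosensoryInfo(action_type="blink", frequency=frequency)
```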
  • the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion, and limb motion.
  • the somatosensory information includes at least one motion information of the user's head movement, facial movement, and limb movement.
  • For example, the somatosensory information may be the action of shaking the head left and right or nodding, together with the amplitude and frequency of shaking or nodding; it may be facial movements such as the user opening the mouth or blinking, together with motion features such as the blink frequency; or it may be the user's action of raising an arm together with the characteristics of that action, such as the amplitude of the raised arm and the angle between the arm and the body.
  • S12 Acquire an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user.
  • The interaction result is the result obtained by separately processing the somatosensory information corresponding to the current user and to at least one associated user and then performing an interactive operation on the processing results. The interactive operation may be comparing, calculating, or simply superimposing the processing results corresponding to each user; for example, a somatosensory action score may be computed from each user's somatosensory information, and the set of scoring results of all users may be used as the interaction result.
  • the client obtains the interaction result between the current user and at least one associated user.
  • The interaction result may be obtained by the client performing interactive processing on the somatosensory information of the current user and of at least one associated user; or the client may process the somatosensory information of the current user, receive from at least one associated user the results of processing their respective somatosensory information, and use the processing results of all users as the interaction result.
  • The current user and each associated user can also report their somatosensory information to the server, and the interaction result obtained by the server by processing the somatosensory information of the current user and of at least one associated user is then acquired from the server.
  • For example, according to the set item rules, the client scores the somatosensory actions of the current user based on the current user's somatosensory information and obtains a corresponding scoring result, for example 85 points, and simultaneously receives the scoring result sent by the associated user matched with the current user, for example 80 points.
  • the somatosensory action rating of each user can be used as the interactive result.
  • the interaction method can be to compare the ratings of each user, and the user with the highest rating will win the project.
  • The interaction result is displayed on the live interface to which the current user's user screen belongs. For example, if the interaction result is the item scores of the current user and of at least one associated user, the item scores of the current user and the associated users are displayed on the current user's screen at the same time, and the word "Winner" is displayed at the position corresponding to the winning user.
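  • A minimal sketch, assuming the "highest score wins" rule described above (the data shapes and user ids are illustrative, not from the patent), of how the clients' scoring results could be combined into the displayed interaction result:

```python
from dataclasses import dataclass

@dataclass
class InteractionResult:
    scores: dict          # user id -> somatosensory score
    winner: str | None    # user with the highest score, None on a tie

def build_interaction_result(scores: dict) -> InteractionResult:
    """Combine each user's scoring result into one interaction result under
    the 'highest score wins' rule."""
    best = max(scores.values())
    leaders = [uid for uid, s in scores.items() if s == best]
    return InteractionResult(scores=scores, winner=leaders[0] if len(leaders) == 1 else None)

# Example with the scores mentioned above: current user 85, associated user 80.
result = build_interaction_result({"current_user": 85, "associated_user": 80})
assert result.winner == "current_user"   # "Winner" is shown at this user's position
```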
  • In this way, the electronic device can determine the somatosensory information corresponding to the somatosensory action in the user screen, obtain the interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, and finally display the interaction result in the live interface to which the current user screen belongs, so that interaction in the form of somatosensory items is realized in the live broadcast room. The present disclosure therefore enriches the forms of interaction between users, realizes interaction among multiple users, and enriches the user interaction scenes in the live broadcast room.
  • Fig. 2a is a flowchart illustrating a method for interacting with a live somatosensory item according to an exemplary embodiment.
  • the embodiment of the present disclosure is a refinement of the above technical solution.
  • the technical solution in this embodiment of the present disclosure may be combined with the various alternatives in one or more of the above-mentioned embodiments.
  • the interactive method of the live somatosensory item includes the following steps.
  • S22 Acquire a scoring result determined by scoring a somatosensory action based on the somatosensory information of the current user and the somatosensory information of at least one associated user, as an interaction result.
  • A method for obtaining an interaction result is provided: a scoring result determined by scoring the somatosensory action based on the somatosensory information of the current user and a scoring result determined by scoring the somatosensory action based on the somatosensory information of at least one associated user are obtained respectively, and the scoring results of all users are collectively used as the interaction result.
  • The scoring result based on each user's somatosensory information can be obtained in two ways. In one way, the somatosensory actions of the current user are scored according to the scoring rules of the live broadcast somatosensory item to obtain the current user's scoring result, the scoring result sent by at least one associated user is received, and the respective scoring results of the current user and the at least one associated user are finally used as the interaction result. In the other way, the somatosensory information of the current user is acquired while the somatosensory information sent by at least one matched associated user is received, the somatosensory actions of each user are then scored according to the scoring rules of the live broadcast somatosensory item, and the set of scoring results of all users is used as the interaction result.
  • For example, the specific operation mode of the live broadcast somatosensory item is to control, by blinking, the jumping of a cartoon character displayed on the screen; specifically, for every blink, the cartoon character jumps forward once and 1 point is scored.
  • The client acquires the current user's somatosensory information, namely blinking at a frequency of 2 times per second, and scores the user's somatosensory actions according to the number of blinks; 15 seconds after the item starts, the current user's score is 30 points.
  • At the same time, the client receives the real-time scoring result sent by the associated user; if, 15 seconds after the item starts, the scoring result sent by the associated user is 25 points, the scores of the current user and the associated user at that moment are used as the interaction result.
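  • The blink item above amounts to a running tally; the sketch below (illustrative names, not the patent's scoring rules) keeps such a per-user score and combines it with the score reported by the matched user:

```python
import time

class BlinkScoreKeeper:
    """Running score for the blink item: every detected blink adds one point."""

    def __init__(self) -> None:
        self.score = 0
        self.start = time.monotonic()

    def on_blink(self) -> None:
        self.score += 1

    def elapsed(self) -> float:
        return time.monotonic() - self.start

keeper = BlinkScoreKeeper()
for _ in range(30):                    # e.g. 30 blinks detected in the first 15 seconds
    keeper.on_blink()

# The local score and the real-time score reported by the associated user
# are combined into the interaction result at that moment.
interaction_result = {"current_user": keeper.score, "associated_user": 25}
```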
  • Obtaining a scoring result determined after performing somatosensory action scoring based on the somatosensory information of the current user and the somatosensory information of at least one associated user, as the interaction result, includes: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information that matches the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by the at least one associated user; and taking the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
  • In this way of obtaining the scoring result, the somatosensory information of the current user is compared with multiple pieces of preset somatosensory information; in response to the current user's somatosensory information matching a certain piece of preset somatosensory information, that preset somatosensory information is used as the target preset somatosensory information; then, according to the correspondence between preset somatosensory information and scores, the score corresponding to the target preset somatosensory information is determined and used as the scoring result of the current user. At the same time, the scoring result of each associated user, obtained by processing that user's somatosensory information at the corresponding client and sent to the current client, is received, and finally the scoring result of the current user and the scoring result of the at least one associated user are used as the interaction result.
  • For example, the specific operation method of the live somatosensory item is to swing the head left and right following a marked position displayed on the client screen, with a photo taken at a random moment; the score depends on the degree of matching between the user's head position and the marked position at the moment of the photo, and the higher the matching degree, the higher the score of the user's somatosensory action. The correspondence between each piece of preset somatosensory information (corresponding to multiple preset positions) and its score is preset in the live somatosensory item. The obtained somatosensory information of the current user includes the user's head position information and the marked position information; the user's somatosensory information is compared with the multiple pieces of preset somatosensory information, the preset somatosensory information that matches the current user's somatosensory information is taken as the target preset somatosensory information, and the score corresponding to the target preset somatosensory information is used as the current user's scoring result.
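  • As a hedged illustration of matching against preset somatosensory information (the preset table, tolerance, and coordinates are assumptions, not values from the patent), the score lookup for the head-position example could look like this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PresetPose:
    """One piece of preset somatosensory information: a marked head position
    (normalised screen coordinates) and the score awarded for matching it."""
    x: float
    y: float
    score: int

# Hypothetical preset table; positions and scores are illustrative only.
PRESETS = [PresetPose(0.3, 0.4, 100), PresetPose(0.5, 0.4, 80), PresetPose(0.7, 0.4, 60)]

def score_head_position(head_x: float, head_y: float, tolerance: float = 0.1) -> int:
    """Compare the user's head position at the snapshot moment with each preset
    position; the closest preset within the tolerance is the target preset
    somatosensory information and its score becomes the user's scoring result."""
    best = None
    for preset in PRESETS:
        dist = ((head_x - preset.x) ** 2 + (head_y - preset.y) ** 2) ** 0.5
        if dist <= tolerance and (best is None or dist < best[0]):
            best = (dist, preset.score)
    return best[1] if best else 0      # 0 if no preset position matches
```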
  • A somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user may also be obtained as the interaction result.
  • The somatosensory action image is an image obtained based on the user's somatosensory information that includes the user's somatosensory action and other information related to the somatosensory information.
  • For example, the somatosensory action image may be an image that includes the current user's somatosensory action with a magic expression superimposed on it.
  • the current client can process the acquired somatosensory information of the current user to obtain a somatosensory action image, and process the received somatosensory action sent by at least one associated user to obtain a somatosensory action image that matches the associated user , and finally use the somatosensory action image of the current user and at least one associated user as the interaction result; the current client can also process only the somatosensory action of the current user to obtain the somatosensory action image, and then receive the somatosensory action image sent by at least one associated user, and The somatosensory motion image of the current user and at least one associated user is used as the interaction result.
  • In the technical solution of this embodiment of the present disclosure, the somatosensory information corresponding to the somatosensory action in the user screen is determined according to the collected user screen of the current user, the scoring result determined after somatosensory action scoring is performed based on the somatosensory information of the current user and the somatosensory information of at least one associated user is obtained as the interaction result, and the interaction result is finally displayed in the live broadcast interface to which the current user screen belongs.
  • Fig. 3a is a flowchart illustrating a method for interacting with a live somatosensory item according to an exemplary embodiment.
  • the embodiment of the present disclosure is a refinement of the above technical solution.
  • the technical solution in this embodiment of the present disclosure may be combined with the various alternatives in one or more of the above-mentioned embodiments.
  • the interactive method for live broadcast somatosensory items includes the following steps.
  • S31 Determine the somatosensory information corresponding to the somatosensory action in the user screen according to the collected user screen of the current user.
  • The Game Information Bridge (GIB) is a data exchange service technology built as an extension of the UDP protocol. The receiver processes each received data packet according to its data type; for loss-sensitive (anti-packet-loss) data, the receiver sends back a corresponding acknowledgment packet (ACK), and in response to the sender not receiving the ACK packet, behaviors such as retransmission are performed.
  • the game information link bridge service has the characteristics of reliability and low latency.
  • information is transmitted through the game information link bridge between the respective clients of the current user and the associated user, and the transmitted information may be user somatosensory information and/or interaction results.
  • The current client can send the current user's somatosensory information to at least one associated user through the game information link bridge, receive the user somatosensory information sent by the at least one associated user through the game information link bridge, and process the information to obtain the interaction result. The current client can also transmit the interaction result directly through the game information link bridge: as shown in Fig. 3b, the interaction result of the current user, such as the scoring result, is sent to at least one associated user, while the scoring result corresponding to each associated user is received from the at least one associated user through the game information link bridge, and the scoring results of all users are finally used as the interaction result.
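  • The following sketch illustrates the acknowledge-and-retransmit behaviour described for the game information link bridge over UDP; the packet layout, field names, timeout, and retry count are assumptions, not the actual GIB wire format:

```python
import json
import socket

ACK_TIMEOUT_S = 0.2
MAX_RETRIES = 5

def send_reliable(sock: socket.socket, peer: tuple, seq: int, data: dict) -> bool:
    """Send one loss-sensitive packet over UDP and wait for its ACK,
    retransmitting on timeout."""
    packet = json.dumps({"type": "somatosensory", "seq": seq, "data": data}).encode()
    sock.settimeout(ACK_TIMEOUT_S)
    for _ in range(MAX_RETRIES):
        sock.sendto(packet, peer)
        try:
            reply, _ = sock.recvfrom(2048)
            ack = json.loads(reply)
            if ack.get("type") == "ack" and ack.get("seq") == seq:
                return True                 # peer confirmed receipt
        except socket.timeout:
            continue                        # no ACK received: retransmit
    return False

def receive_once(sock: socket.socket) -> dict:
    """Receiver side: process one packet according to its type and return the
    corresponding ACK for loss-sensitive data."""
    raw, addr = sock.recvfrom(2048)
    message = json.loads(raw)
    if message.get("type") == "somatosensory":
        sock.sendto(json.dumps({"type": "ack", "seq": message["seq"]}).encode(), addr)
    return message
```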
  • S33 Perform interactive processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain an interactive result.
  • After the current client transmits the user somatosensory information through the game information link bridge, it performs interactive processing on the somatosensory information of the current user and the somatosensory information of the associated user to obtain the final interaction result.
  • the client uses preset scoring rules for live-streaming somatosensory items to calculate the scores of the current user and associated users' somatosensory actions, and uses the scoring results of all users as the interactive results.
  • Processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result includes: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and of the associated user, where the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and of the associated user together as the interaction result.
  • a specific method for processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interactive result is provided.
  • The somatosensory information of the current user and the somatosensory information of the associated user transmitted through the game information link bridge are processed; for example, each user's somatosensory action is scored according to the somatosensory information, or the somatosensory action image displayed on the client is controlled according to the somatosensory information. The item information of the current user and of the associated user is thereby obtained and collectively used as the interaction result, where the item information may include a scoring result and a somatosensory action image.
  • The method further includes: reporting the somatosensory information to a server; correspondingly, acquiring the interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interaction result processed by the server based on the somatosensory information of the current user and the somatosensory information of the at least one associated user.
  • a specific method for obtaining an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user is also provided.
  • The somatosensory information is reported to the server, and the server processes the somatosensory information reported by the current user and the somatosensory information reported by at least one associated user to obtain the interaction result, which is fed back to the client corresponding to each user.
  • the server scores the somatosensory information reported by each user according to a preset scoring rule, obtains the scoring result of each user, and feeds back the scoring result of each user to each user.
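  • A server-side sketch of this reporting path (the scoring rule and the shape of the reported somatosensory information are placeholders, not from the patent): each report in a match is scored and the same interaction result is fed back to every client:

```python
def score_somatosensory(info: dict) -> int:
    """Placeholder scoring rule (assumption): 1 point per reported blink,
    mirroring the blink item described earlier."""
    return int(info.get("blink_count", 0))

def process_match(reports: dict) -> dict:
    """Server side: `reports` maps user id -> reported somatosensory
    information. Each report is scored and the same interaction result is
    fed back to every client in the match."""
    scores = {uid: score_somatosensory(info) for uid, info in reports.items()}
    top = max(scores.values(), default=0)
    winners = [uid for uid, s in scores.items() if s == top]
    interaction_result = {"scores": scores,
                          "winner": winners[0] if len(winners) == 1 else None}
    return {uid: interaction_result for uid in scores}
```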
  • In the technical solution of this embodiment, the somatosensory information corresponding to the somatosensory action in the user screen is determined according to the collected user screen of the current user; user somatosensory information and/or interaction results are transmitted through the game information link bridge between the respective clients of the current user and the associated user; after the user somatosensory information is transmitted, interactive processing is performed based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result; and the interaction result is finally displayed in combination with the live interface to which the current user screen belongs.
  • Fig. 4a is a flowchart showing a method for interacting with a live somatosensory item according to an exemplary embodiment.
  • the embodiment of the present disclosure is a refinement of the above technical solution.
  • the technical solution in this embodiment of the present disclosure may be combined with the various alternatives in one or more of the above-mentioned embodiments.
  • the interactive method of the live somatosensory item includes the following steps.
  • the live broadcast somatosensory item is in the form of multiple user interactions. Therefore, after the user opens the live broadcast room and selects the somatosensory item duel option, the client sends an opponent matching request to the server, wherein the opponent matching request includes the current user's ID, and identification information of the somatosensory item currently selected by the user.
  • The server may determine the somatosensory item selected by the current user according to the somatosensory item identification information contained in the opponent matching request, obtain all users who have selected that somatosensory item, and randomly select one or more of these users to match with the current user, where the number of selected users depends on the somatosensory item confrontation mode selected by the user.
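  • A sketch of such server-side opponent matching (the waiting-pool structure and function names are assumptions): users who selected the same somatosensory item are pooled, and a random opponent set of the size required by the confrontation mode is drawn:

```python
import random

# Maps a somatosensory item id to the users currently waiting for an opponent
# in that item (all ids and names here are illustrative).
waiting_pool: dict = {}

def match_opponents(user_id: str, item_id: str, opponents_needed: int = 1):
    """Handle one opponent matching request: pick, at random, users who
    selected the same somatosensory item; the number picked depends on the
    chosen confrontation mode. Returns None and queues the requester if the
    pool is not yet large enough."""
    pool = waiting_pool.setdefault(item_id, set())
    pool.discard(user_id)
    if len(pool) < opponents_needed:
        pool.add(user_id)                  # wait for more players on this item
        return None
    chosen = random.sample(sorted(pool), opponents_needed)
    pool.difference_update(chosen)
    return chosen                          # the associated user(s) fed back to the client
```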
  • S42 Receive at least one associated user fed back by the server that matches the current user, and an interaction start instruction.
  • the client terminal receives at least one associated user matched with the current user fed back by the server for the opponent matching request, and simultaneously receives the interaction start instruction fed back by the server, and starts the live broadcast of the somatosensory item in response to the instruction.
  • For example, the client receives the ID of at least one associated user fed back by the server, determines the one or more other users competing with the current user, and displays the relevant information of the associated users at the corresponding positions on the screen; after the client receives the interaction start instruction fed back by the server, a countdown is displayed on the screen to prompt the user that the live somatosensory item is about to start.
  • the client starts to collect the user screen of the current user, and the user screen contains the somatosensory actions of the current user, which are used to determine the interaction results between multiple users according to the somatosensory actions of the user .
  • An implementation of determining the somatosensory information corresponding to the somatosensory action in the user screen according to the user screen is provided: the collected user screen is input into the somatosensory information acquisition model, and the information output by the somatosensory information acquisition model is the somatosensory information corresponding to the current user's somatosensory action.
  • the somatosensory information acquisition model may be a model capable of recognizing the input user screen and obtaining the user's somatosensory motion characteristics. For example, when the user shakes his head, the user's somatosensory motion type can be determined as shaking his head, and the amplitude and frequency of shaking his head can be obtained.
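  • The model's interface can be summarised as "one user screen in, one piece of somatosensory information out"; the sketch below is a hypothetical interface, not the patent's model, and mirrors the SomatosensoryInfo shape used in the earlier sketch:

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class SomatosensoryInfo:
    action_type: str      # e.g. "shake_head", "blink", "raise_arm"
    amplitude: float      # normalised amplitude of the recognised action
    frequency: float      # repetitions per second

class SomatosensoryModel(Protocol):
    """Assumed interface: one collected user screen (a decoded video frame)
    in, one piece of somatosensory information out."""
    def __call__(self, frame) -> SomatosensoryInfo: ...

def determine_somatosensory_info(model: SomatosensoryModel, frame) -> SomatosensoryInfo:
    # The client forwards each collected user screen to the model and uses
    # its output as the somatosensory information for that frame.
    return model(frame)
```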
  • S45 Acquire an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user.
  • In the technical solution of this embodiment, an opponent matching request is sent to the server; then, in response to the interaction start instruction sent by the server, the client starts to collect the user screen containing the user's somatosensory action, inputs the user screen into the somatosensory information acquisition model, and obtains the somatosensory information corresponding to the user's somatosensory action output by the model; the interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user is finally obtained and displayed in the live interface to which the current user's user screen belongs. Interaction in the form of somatosensory items is thus achieved in the live broadcast room, which enriches the user interaction scenes in the live broadcast room.
  • Fig. 5 is a block diagram of an interactive device for a live somatosensory item according to an exemplary embodiment.
  • the apparatus includes a somatosensory information determination module 510 , an interaction result acquisition module 520 and an interaction result display module 530 .
  • the somatosensory information determining module 510 is configured to determine the somatosensory information corresponding to the somatosensory action in the user screen according to the collected user screen of the current user;
  • the interaction result acquisition module 520 is configured to perform acquisition of an interaction result processed based on the somatosensory information of the current user and the somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user;
  • the interactive result display module 530 is configured to display the interactive result in combination with the live interface to which the current user screen belongs.
  • the interaction result obtaining module 520 includes:
  • the first interaction result obtaining unit is configured to acquire, as the interaction result, a scoring result determined by scoring the somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one of the associated users.
  • the first interaction result obtaining unit is further configured to: compare the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information that matches the somatosensory information of the current user; take the score corresponding to the target preset somatosensory information as the scoring result of the current user; receive the scoring result of the associated user sent by at least one of the associated users; and take the scoring result of the current user and the scoring result of at least one of the associated users as the interaction result.
  • the interaction result obtaining module 520 is further configured to execute:
  • a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user is acquired as the interaction result.
  • the interaction result obtaining module 520 is further configured to execute:
  • the user somatosensory information and/or the interaction result is transmitted through the game information link bridge between the respective clients of the current user and the associated user.
  • the interaction result obtaining module 520 further includes:
  • the second interaction result obtaining unit is configured to perform interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user, so as to obtain an interaction result.
  • the second interaction result obtaining unit is further configured to: process the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and of the associated user, where the item information includes a scoring result and a somatosensory action image; and take the item information of the current user and of the associated user together as the interaction result.
  • the live somatosensory item interaction apparatus further includes:
  • an information reporting module, configured to report the somatosensory information to the server;
  • correspondingly, the interaction result acquisition module 520 is further configured to: acquire the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
  • the live somatosensory item interaction apparatus further includes:
  • a request sending module, configured to send an opponent matching request to the server in response to the current user's operation of opening a live room and selecting a somatosensory item duel;
  • an instruction receiving module, configured to receive at least one associated user matched with the current user and an interaction start instruction fed back by the server.
  • the somatosensory information determination module 510 includes:
  • a user picture collection unit, configured to collect, in response to the interaction start instruction sent by the server, the user picture containing the user's somatosensory action;
  • a somatosensory information acquisition unit, configured to input the user picture into a somatosensory information acquisition model and acquire the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
  • the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion, and limb motion.
  • Fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment. As shown in Fig. 6, the electronic device includes:
  • one or more processors 610 and a memory 620; one processor 610 is taken as an example in Fig. 6;
  • the processor 610 and the memory 620 in the device may be connected by a bus or in other ways, and the connection by a bus is taken as an example in FIG. 6 .
  • the memory 620 can be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the live somatosensory item interaction method in the embodiments of the present disclosure (for example, the somatosensory information determination module 510, the interaction result acquisition module 520 and the interaction result display module 530 shown in Fig. 5).
  • the processor 610 executes various functional applications and data processing of the computer device by running the software programs, instructions and modules stored in the memory 620, thereby implementing the live somatosensory item interaction method of the above method embodiments, namely: determining, according to the collected user picture of the current user, the somatosensory information corresponding to the somatosensory action in the user picture; acquiring an interaction result obtained by processing the somatosensory information of the current user and the somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
  • displaying the interaction result in the live interface to which the current user's picture belongs.
  • the memory 620 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the computer equipment, and the like. Additionally, memory 620 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 620 may optionally include memory located remotely from the processor 610, and these remote memories may be connected to the terminal device through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • a storage medium including instructions, such as the memory 620 including instructions, is also provided, and the above instructions can be executed by the processor 610 of the electronic device to complete the above method.
  • the storage medium may be a non-transitory computer-readable storage medium such as ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • a computer program product for use in conjunction with an electronic device is provided; the computer program product includes a computer-readable storage medium and a computer program mechanism embedded therein, and after the program is loaded and executed by a computer, the live somatosensory item interaction method described in any embodiment of the present disclosure can be implemented.
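For illustration only, the following minimal Python sketch mirrors the module structure of Fig. 5 described above (modules 510, 520 and 530). The class names follow the modules, but the model interface, the frequency-based scoring rule and the rendering call are assumptions introduced for this sketch rather than the disclosed implementation.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class SomatosensoryInfo:
    """Hypothetical container for motion features extracted from a user picture."""
    user_id: str
    action: str             # e.g. "blink" or "nod"
    amplitude: float = 0.0
    frequency: float = 0.0  # occurrences per second


class SomatosensoryInfoDeterminationModule:          # module 510 (sketch)
    def __init__(self, model):
        self.model = model  # assumed somatosensory information acquisition model

    def determine(self, user_id: str, user_picture) -> SomatosensoryInfo:
        # The model interface is an assumption; it returns action features per picture.
        action, amplitude, frequency = self.model.predict(user_picture)
        return SomatosensoryInfo(user_id, action, amplitude, frequency)


class InteractionResultAcquisitionModule:            # module 520 (sketch)
    def acquire(self, current: SomatosensoryInfo,
                associated: List[SomatosensoryInfo]) -> Dict[str, float]:
        # Placeholder rule: score every user by motion frequency; real item rules differ.
        return {info.user_id: info.frequency for info in [current, *associated]}


class InteractionResultDisplayModule:                # module 530 (sketch)
    def display(self, live_interface, result: Dict[str, float]) -> None:
        # Placeholder rendering: overlay each score on the live interface (assumed API).
        for user_id, score in result.items():
            live_interface.draw_text(f"{user_id}: {score:.0f}")
```

In this sketch the acquisition module simply scores by motion frequency; an actual item would apply the scoring rules described in the embodiments below.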


Abstract

A livestream motion sensing project interaction method and apparatus. The method comprises: determining, according to an acquired user image of a current user, motion sensing information corresponding to a motion sensing action in the user image; obtaining an interaction result of processing performed on the basis of the motion sensing information of the current user and motion sensing information of at least one associated user, wherein the associated user is a user matched with the current user; and displaying the interaction result on the livestream interface to which the image of the current user belongs. In this way, the user can play a motion sensing project in a livestream room, which enriches the forms of user interaction, and interaction between multiple users can be achieved, which enriches user interaction scenes in the livestream room.

Description

Live somatosensory item interaction method and apparatus
CROSS-REFERENCE TO RELATED APPLICATIONS
The present disclosure claims priority to Chinese Patent Application No. 202010725851.8, filed on July 24, 2020, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The present disclosure relates to the field of computer technologies, and in particular, to a live somatosensory item interaction method and apparatus.
BACKGROUND
With the development of live-streaming technology, streaming through various live-streaming applications or watching other people's streams has become one of the ways people relax and entertain themselves. During a live stream, how to enhance the interactivity of the stream and give users a better experience has become one of the important research directions.
SUMMARY OF THE INVENTION
Embodiments of the present disclosure provide a live somatosensory item interaction method and apparatus. The technical solutions of the present disclosure are as follows:
According to some embodiments of the present disclosure, a live somatosensory item interaction method is provided. The method is applied to a client and includes:
determining, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
acquiring an interaction result obtained by processing based on the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
displaying the interaction result in the live interface to which the current user's picture belongs.
In a possible implementation, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring, as the interaction result, a scoring result determined by scoring somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, acquiring, as the interaction result, the scoring result determined by scoring somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information matching the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by at least one associated user; and taking the scoring result of the current user and the scoring result of at least one associated user as the interaction result.
In a possible implementation, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring, as the interaction result, a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: transmitting the user somatosensory information and/or the interaction result through a game information link bridge between the respective clients of the current user and the associated user.
In a possible implementation, the method further includes: performing interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result.
In a possible implementation, performing processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result includes: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and the associated user together as the interaction result.
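For illustration, the exchange described in the preceding paragraph could be sketched in Python roughly as follows; the bridge's send/receive interface and the JSON encoding are assumptions of this sketch, not part of the disclosure.

```python
import json


def exchange_and_merge(bridge, my_item_info: dict) -> dict:
    """Sketch: send this client's item information over the game information link
    bridge, receive the associated user's item information, and combine both as
    the interaction result (item information here holds a scoring result and a
    reference to a somatosensory action image)."""
    bridge.send(json.dumps(my_item_info))           # assumed bridge API
    peer_item_info = json.loads(bridge.receive())   # assumed bridge API
    return {"current_user": my_item_info, "associated_user": peer_item_info}
```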
In a possible implementation, the method further includes: reporting the somatosensory information to a server; correspondingly, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, the method further includes: sending an opponent matching request to the server in response to the current user's operation of opening a live room and selecting a somatosensory item duel; and receiving at least one associated user matched with the current user and an interaction start instruction fed back by the server.
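The matching request flow above can be pictured with the following rough Python sketch; the message names and the JSON channel are hypothetical, since the disclosure does not specify a wire format.

```python
import json


def request_opponent_match(channel, user_id: str) -> dict:
    """Sketch of the client side of opponent matching (message fields are assumed)."""
    channel.send(json.dumps({"type": "opponent_match_request", "user": user_id}))
    reply = json.loads(channel.receive())
    # Assumed reply: the matched associated users plus an interaction start instruction.
    return {
        "associated_users": reply.get("associated_users", []),
        "interaction_start": reply.get("interaction_start", False),
    }
```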
In a possible implementation, determining, according to the collected user picture of the current user, the somatosensory information corresponding to the somatosensory action in the user picture includes: collecting, in response to an interaction start instruction sent by the server, the user picture containing the user's somatosensory action; and inputting the user picture into a somatosensory information acquisition model, and acquiring the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
In a possible implementation, the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion and limb motion.
According to some embodiments of the present disclosure, a live somatosensory item interaction apparatus is provided, the apparatus including:
a somatosensory information determination module, configured to determine, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
an interaction result acquisition module, configured to acquire an interaction result obtained by processing based on the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
an interaction result display module, configured to display the interaction result in the live interface to which the current user's picture belongs.
In a possible implementation, the interaction result acquisition module includes:
a first interaction result acquisition unit, configured to acquire, as the interaction result, a scoring result determined by scoring somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, the first interaction result acquisition unit is further configured to:
compare the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information matching the somatosensory information of the current user;
take the score corresponding to the target preset somatosensory information as the scoring result of the current user;
receive the scoring result of the associated user sent by at least one associated user; and
take the scoring result of the current user and the scoring result of at least one associated user as the interaction result.
In a possible implementation, the interaction result acquisition module is further configured to:
acquire, as the interaction result, a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, the interaction result acquisition module is further configured to:
transmit the user somatosensory information and/or the interaction result through the game information link bridge between the respective clients of the current user and the associated user.
In a possible implementation, the interaction result acquisition module further includes:
a second interaction result acquisition unit, configured to perform interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user, so as to obtain the interaction result.
In a possible implementation, the second interaction result acquisition unit is further configured to:
process the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and
take the item information of the current user and the associated user together as the interaction result.
In a possible implementation, the live somatosensory item interaction apparatus further includes:
an information reporting module, configured to report the somatosensory information to the server;
correspondingly, the interaction result acquisition module is configured to:
acquire the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, the live somatosensory item interaction apparatus further includes:
a request sending module, configured to send an opponent matching request to the server in response to the current user's operation of opening a live room and selecting a somatosensory item duel; and
an instruction receiving module, configured to receive at least one associated user matched with the current user and an interaction start instruction fed back by the server.
In a possible implementation, the somatosensory information determination module includes:
a user picture collection unit, configured to collect, in response to an interaction start instruction sent by the server, the user picture containing the user's somatosensory action; and
a somatosensory information acquisition unit, configured to input the user picture into a somatosensory information acquisition model and acquire the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
In a possible implementation, the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion and limb motion.
According to some embodiments of the present disclosure, an electronic device is provided, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the following steps: determining, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
acquiring an interaction result obtained by processing based on the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
displaying the interaction result in the live interface to which the current user's picture belongs.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following step: acquiring, as the interaction result, a scoring result determined by scoring somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following steps: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information matching the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by at least one associated user; and taking the scoring result of the current user and the scoring result of at least one associated user as the interaction result.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following step: acquiring, as the interaction result, a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following step: transmitting the user somatosensory information and/or the interaction result through the game information link bridge between the respective clients of the current user and the associated user.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following step: performing interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following steps: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and the associated user together as the interaction result.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following steps: reporting the somatosensory information to a server; correspondingly, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following steps: sending an opponent matching request to the server in response to the current user's operation of opening a live room and selecting a somatosensory item duel; and receiving at least one associated user matched with the current user and an interaction start instruction fed back by the server.
In a possible implementation, when executing the instructions, the processor is configured to further implement the following steps: collecting, in response to an interaction start instruction sent by the server, the user picture containing the user's somatosensory action; and inputting the user picture into a somatosensory information acquisition model, and acquiring the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
In a possible implementation, the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion and limb motion.
According to some embodiments of the present disclosure, a computer-readable storage medium is provided. When the instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the following steps: determining, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
acquiring an interaction result obtained by processing based on the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
displaying the interaction result in the live interface to which the current user's picture belongs.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step: acquiring, as the interaction result, a scoring result determined by scoring somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information matching the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by at least one associated user; and taking the scoring result of the current user and the scoring result of at least one associated user as the interaction result.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step: acquiring, as the interaction result, a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step: transmitting the user somatosensory information and/or the interaction result through the game information link bridge between the respective clients of the current user and the associated user.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step: performing interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and the associated user together as the interaction result.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps: reporting the somatosensory information to a server; correspondingly, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps: sending an opponent matching request to the server in response to the current user's operation of opening a live room and selecting a somatosensory item duel; and receiving at least one associated user matched with the current user and an interaction start instruction fed back by the server.
In a possible implementation, when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps: collecting, in response to an interaction start instruction sent by the server, the user picture containing the user's somatosensory action; and inputting the user picture into a somatosensory information acquisition model, and acquiring the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
In a possible implementation, the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion and limb motion.
According to some embodiments of the present disclosure, a computer program product for use in conjunction with an electronic device is provided. The computer program product includes a computer-readable storage medium and a computer program mechanism embedded therein. After the program is loaded and executed by a computer, the following steps can be performed: determining, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
acquiring an interaction result obtained by processing based on the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
displaying the interaction result in the live interface to which the current user's picture belongs.
In a possible implementation, after the computer program is loaded and executed by a computer, the following step can be performed: acquiring, as the interaction result, a scoring result determined by scoring somatosensory actions based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, after the computer program is loaded and executed by a computer, the following steps can be performed: comparing the somatosensory information of the current user with at least one piece of preset somatosensory information to determine target preset somatosensory information matching the somatosensory information of the current user; taking the score corresponding to the target preset somatosensory information as the scoring result of the current user; receiving the scoring result of the associated user sent by at least one associated user; and taking the scoring result of the current user and the scoring result of at least one associated user as the interaction result.
In a possible implementation, after the computer program is loaded and executed by a computer, the following step can be performed: acquiring, as the interaction result, a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, after the computer program is loaded and executed by a computer, the following step can be performed: transmitting the user somatosensory information and/or the interaction result through the game information link bridge between the respective clients of the current user and the associated user.
In a possible implementation, after the computer program is loaded and executed by a computer, the following step can be performed: performing interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result.
In a possible implementation, after the computer program is loaded and executed by a computer, the following steps can be performed: processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and taking the item information of the current user and the associated user together as the interaction result.
In a possible implementation, after the computer program is loaded and executed by a computer, the following steps can be performed: reporting the somatosensory information to a server; correspondingly, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user includes: acquiring the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user.
In a possible implementation, after the computer program is loaded and executed by a computer, the following steps can be performed: sending an opponent matching request to the server in response to the current user's operation of opening a live room and selecting a somatosensory item duel; and receiving at least one associated user matched with the current user and an interaction start instruction fed back by the server.
In a possible implementation, after the computer program is loaded and executed by a computer, the following steps can be performed: collecting, in response to an interaction start instruction sent by the server, the user picture containing the user's somatosensory action; and inputting the user picture into a somatosensory information acquisition model, and acquiring the somatosensory information corresponding to the user's somatosensory action output by the somatosensory information acquisition model.
In a possible implementation, the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion and limb motion.
By playing somatosensory games in the live room, the embodiments of the present disclosure enrich the forms of interaction between anchors and viewers, enable interaction among multiple anchors, enhance the interactivity of live streaming, and enrich the interaction scenarios of anchors in the live room.
It should be understood that the above general description and the following detailed description are only exemplary and explanatory, and do not limit the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure; they do not unduly limit the present disclosure.
Fig. 1 is a flowchart of a live somatosensory item interaction method according to an exemplary embodiment.
Fig. 2a is a flowchart of a live somatosensory item interaction method according to an exemplary embodiment.
Fig. 2b is a schematic diagram of live somatosensory item interaction according to an exemplary embodiment.
Fig. 3a is a flowchart of a live somatosensory item interaction method according to an exemplary embodiment.
Fig. 3b is a schematic diagram of transmitting an interaction result through a game information link bridge according to an exemplary embodiment.
Fig. 4a is a flowchart of a live somatosensory item interaction method according to an exemplary embodiment.
Fig. 4b is a schematic diagram of opponent matching for a live somatosensory item according to an exemplary embodiment.
Fig. 5 is a block diagram of a live somatosensory item interaction apparatus according to an exemplary embodiment.
Fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment.
DETAILED DESCRIPTION
To enable those of ordinary skill in the art to better understand the technical solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings.
It should be noted that the terms "first", "second" and the like in the description and claims of the present disclosure and in the above drawings are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence. It should be understood that the data so used may be interchanged where appropriate, so that the embodiments of the present disclosure described herein can be implemented in orders other than those illustrated or described herein. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as recited in the appended claims. It should be noted that all embodiments of the present disclosure may be executed independently or in combination with other embodiments, which is not limited by the present disclosure.
When magic expressions are used in existing live rooms, some accessories or cartoon characters are usually superimposed on the user's head. Although this increases the richness and interest of the anchor's content, it is still limited to richer visual effects: it neither enhances the interactivity of the live stream nor involves interaction among multiple users. The embodiments of the present disclosure are applicable to the case of interacting in the form of live somatosensory items in the live room. According to the embodiments of the present disclosure, somatosensory information corresponding to a somatosensory action in a user picture is determined according to the collected user picture of the current user; then an interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user is acquired, wherein the associated user is a user matched with the current user; and finally the interaction result is displayed in the live interface to which the current user's picture belongs. This enriches the forms of interaction between users, enables interaction among multiple users, and enriches the user interaction scenarios in the live room.
Fig. 1 is a flowchart of a live somatosensory item interaction method according to an exemplary embodiment. As shown in Fig. 1, the live somatosensory item interaction method is used in an electronic device and is executed by a processor configured in the electronic device; the method includes the following steps.
S11: determine, according to the collected user picture of the current user, the somatosensory information corresponding to the somatosensory action in the user picture.
The user picture is a picture collected by the client that contains the user's somatosensory action and is used to obtain the user's somatosensory information; for example, the user picture may be a picture collected by the client's camera while the user blinks or shakes his or her head. The somatosensory information is information extracted from the somatosensory actions of the user's head, face, limbs or other body parts that can characterize the user's current somatosensory action; for example, the somatosensory information may include a head-shaking action, the amplitude of the head shaking, and the frequency of the head shaking.
In the embodiments of the present disclosure, the client collects the user picture of the current user through a camera, then recognizes the user picture, extracts the features of the somatosensory action in the user picture, and forms the user's somatosensory information from at least one somatosensory action feature. For example, the client's front camera collects the current user's picture in real time and performs image recognition on it; if the current user's somatosensory action is recognized as blinking at a frequency of twice per second, the information "blink" and "twice per second" can finally be packaged as the somatosensory information corresponding to the somatosensory action in the user picture.
In an implementation of the embodiments of the present disclosure, the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion and limb motion.
In the above possible embodiment, the somatosensory information includes at least one piece of motion information among the user's head motion, facial motion and limb motion. For example, the somatosensory information may be shaking the head left and right or nodding, together with the amplitude and frequency of the shaking or nodding; it may also be facial movements such as opening the mouth or blinking, together with motion features such as blink frequency; or it may be the action of the user raising an arm together with the features of that action, such as the amplitude of the raised arm and the angle between the upper arm and the body.
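For illustration, the blink example of step S11 can be sketched in Python as follows; the camera and model interfaces are assumptions (any frame source and any per-frame action classifier would do) rather than the disclosed implementation.

```python
import time


def collect_somatosensory_info(camera, model, window_seconds: float = 1.0) -> dict:
    """Sketch of S11: derive blink-type somatosensory information from user pictures."""
    blink_count = 0
    deadline = time.time() + window_seconds
    while time.time() < deadline:
        frame = camera.read()                # assumed camera API returning one user picture
        if model.predict(frame) == "blink":  # assumed per-frame classifier output
            blink_count += 1
    # Package the action and its observed frequency as the somatosensory information.
    return {"action": "blink", "frequency_per_second": blink_count / window_seconds}
```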
S12: acquire an interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user.
The interaction result is the result obtained by separately processing the somatosensory information corresponding to the current user and to at least one associated user and then performing an interactive operation on the processing results; the interactive operation may be comparing, calculating or simply aggregating the processing results corresponding to the users. For example, a somatosensory action score may be computed for each user's somatosensory information, and the set of the users' scoring results may be used as the interaction result.
In the embodiments of the present disclosure, in order to increase the interaction scenarios between users, the client acquires the interaction result of the current user and at least one associated user. The interaction result may be obtained by the client performing interactive processing on the somatosensory information of the current user and of at least one associated user; it may also be obtained by processing the somatosensory information of the current user, receiving the results obtained by at least one associated user from processing their own somatosensory information, and finally taking the users' processing results together as the interaction result; or, of course, the current user and each associated user may report their own somatosensory information to the server, and the interaction result obtained by the server from processing the somatosensory information of the current user and of at least one associated user is finally acquired from the server.
In a specific implementation of the embodiments of the present disclosure, based on the current user's somatosensory information, the client scores the current user's somatosensory action according to the set item rules and obtains a corresponding scoring result, for example 85 points, while receiving a scoring result of 80 points sent by the associated user matched with the current user. The somatosensory action scores of the users can finally be taken together as the interaction result; here, the interaction may consist of comparing the users' scores and taking the user with the highest score as the winner of the item.
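Continuing the 85-versus-80 example, the following minimal sketch shows one way a client might assemble such an interaction result; the highest-score-wins rule follows the text, while the function and field names are introduced here for illustration only.

```python
def build_interaction_result(current_user: str, current_score: float,
                             associated_scores: dict) -> dict:
    """Sketch of S12: collect all scores and mark the highest scorer as the winner."""
    scores = {current_user: current_score, **associated_scores}
    winner = max(scores, key=scores.get)
    return {"scores": scores, "winner": winner}


# Example from the text: the current user scores 85 and the associated user scores 80.
result = build_interaction_result("current_user", 85, {"associated_user": 80})
# result == {"scores": {"current_user": 85, "associated_user": 80}, "winner": "current_user"}
```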
S13: display the interaction result in the live interface to which the current user's picture belongs.
In the embodiments of the present disclosure, after the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user is acquired, the interaction result is displayed in the live interface to which the current user's picture belongs. For example, if the interaction result is the item scores of the current user and at least one associated user, the item scores of the current user and the associated user are displayed at the same time in the current user's picture; the word "Winner" may also be displayed at the position corresponding to the user with the highest score according to the comparison of the users' item scores.
With the technical solutions of the embodiments of the present disclosure, the electronic device can determine the somatosensory information corresponding to the somatosensory action in the user picture, acquire the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user, and finally display the interaction result in the live interface to which the current user's picture belongs, thereby enabling interaction in the form of somatosensory items in the live room. The present disclosure thus enriches the forms of interaction between users, enables interaction among multiple users, and enriches the user interaction scenarios in the live room.
图2a是根据一示例性实施例示出的一种直播体感项目互动方法的流程图,本公开实施例是对上述技术方案的细化,本公开实施例中的技术方案可以与上述一个或者多个实施例中的各个可选方案结合。如图2a所示,直播体感项目互动方法包括如下步骤。Fig. 2a is a flowchart illustrating a method for interacting with a live somatosensory item according to an exemplary embodiment. The embodiment of the present disclosure is a refinement of the above technical solution. The technical solution in the embodiment of the present disclosure may be combined with one or more of the above-mentioned technical solutions. The various alternatives in the embodiments are combined. As shown in Fig. 2a, the interactive method of the live somatosensory item includes the following steps.
S21,根据采集到的当前用户的用户画面,确定与用户画面中的体感动作对应的体感信息。S21 , according to the collected user screen of the current user, determine somatosensory information corresponding to the somatosensory action in the user screen.
S22,获取基于当前用户的体感信息以及至少一个关联用户的体感信息进行体感动作评分后确定的评分结果,作为互动结果。S22: Acquire a scoring result determined by scoring a somatosensory action based on the somatosensory information of the current user and the somatosensory information of at least one associated user, as an interaction result.
本公开实施例中,提供一种获取互动结果的方式,分别获取基于当前用户的体感信息进行体感动作评分后确定的评分结果,以及基于至少一个关联用户的体感信息进行体感动作评分后确定的评分结果,将各用户的评分结果共同作为互动结果。获取基于各用户体感信息的评分结果,可以是针对当前用户的体感信息,根据直播体感项目的评分规则,为当 前用户的体感动作进行评分,获取当前用户的评分结果,然后接收至少一个关联用户发送的各自的评分结果,最终将当前用户和至少一个关联用户的评分结果的集合作为互动结果;当然,获取基于各用户体感信息的评分结果,也可以是获取当前用户体感信息的同时,接收至少一个关联用户发送的与各关联用户匹配的体感信息,然后根据直播体感项目的评分规则,对各用户的体感动作进行评分,最终将获取到的各用户评分结果的集合作为互动结果。In the embodiment of the present disclosure, a method for obtaining an interaction result is provided, respectively obtaining a scoring result determined after performing a somatosensory action score based on the somatosensory information of the current user, and a score determined after performing a somatosensory action scoring based on the somatosensory information of at least one associated user. As a result, the rating results of each user are collectively used as the interactive result. Obtain the scoring result based on the somatosensory information of each user, which can be for the somatosensory information of the current user, according to the scoring rules of the live broadcast somatosensory item, to score the somatosensory actions of the current user, obtain the scoring result of the current user, and then receive at least one associated user. The respective scoring results of the current user and at least one associated user are finally used as the interactive result; of course, to obtain the scoring results based on the somatosensory information of each user, it is also possible to obtain the somatosensory information of the current user while receiving at least one The somatosensory information sent by the associated user that matches each associated user, and then according to the scoring rules of the live somatosensory item, the somatosensory actions of each user are scored, and finally the set of obtained scoring results of each user is used as the interactive result.
在本公开实施例的一个具体实现方式中,直播体感项目具体操作方式为通过眨眼控制屏幕上显示卡通人物的跳动,具体如图2b所示,评分规则是眨眼频率越快,卡通人物跳动越快,具体为,每眨眼一次,卡通人物向前跳动一次,获得1分。响应于客户端获取到当前用户的体感信息为,以每秒2次的频率眨眼,则根据用户眨眼次数为用户体感动作进行评分,在项目开始15秒后,统计得到当前用户评分结果为30分,同时客户端会接收关联用户实时发送的评分结果,在项目开始15秒后,关联用户发送的评分结果为25分,则将该时刻当前用户以及关联用户的评分共同作为互动结果。In a specific implementation of the embodiment of the present disclosure, the specific operation mode of the live broadcast somatosensory item is to control the beating of the cartoon characters displayed on the screen by blinking. , specifically, for every blink, the cartoon character jumps forward once, and gets 1 point. In response to the client's acquisition of the current user's somatosensory information, blinking at a frequency of 2 times per second, the user's somatosensory actions are scored according to the number of blinks of the user's eyes. 15 seconds after the project starts, the current user's score is 30 points. , and the client will receive the real-time scoring result sent by the associated user. 15 seconds after the project starts, if the scoring result sent by the associated user is 25 points, the current user and associated user's rating at that moment will be used as the interactive result.
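For illustration only, the following Python sketch shows how a client might implement the blink-scoring rule of this example (one point per blink, with the scores of both users collected into the interaction result). The function and variable names are assumptions and do not appear in the disclosure.

```python
# Illustrative sketch of the blink-scoring rule in this example (assumed names,
# not the actual client code): every blink earns 1 point, and the scores of the
# current user and the associated user together form the interaction result.

def score_blinks(blink_timestamps, elapsed_seconds):
    """Count the blinks observed so far; each blink is worth 1 point."""
    return len([t for t in blink_timestamps if t <= elapsed_seconds])

# Current user blinks twice per second; 15 seconds after the item starts
# this yields the 30 points mentioned above.
current_score = score_blinks([i * 0.5 for i in range(30)], elapsed_seconds=15)

# Score reported by the associated user's client at the same moment.
associated_score = 25

# Both scores taken together form the interaction result at that moment.
interaction_result = {"current_user": current_score, "associated_user": associated_score}
```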
In an implementation of this embodiment of the present disclosure, obtaining, as the interaction result, the scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user includes:
comparing the somatosensory information of the current user with at least one piece of preset somatosensory information, and determining target preset somatosensory information that matches the somatosensory information of the current user;
taking the score corresponding to the target preset somatosensory information as the scoring result of the current user;
receiving the scoring result of the associated user sent by at least one associated user; and
taking the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
The above optional embodiment provides a specific way of obtaining, as the interaction result, the scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user. First, the somatosensory information of the current user is compared with multiple pieces of preset somatosensory information; in response to the somatosensory information of the current user matching a certain piece of preset somatosensory information, that preset somatosensory information is taken as the target preset somatosensory information. The score corresponding to the target preset somatosensory information is then determined according to the correspondence between preset somatosensory information and scores, and this score is taken as the scoring result of the current user. At the same time, the scoring result of the associated user sent by at least one associated user is received, where the scoring result of each associated user is obtained by the client corresponding to that associated user by processing the associated user's somatosensory information and is sent to the current client. Finally, the scoring results of the current user and the at least one associated user are taken together as the interaction result.
For example, the live-streaming somatosensory item may require the user's head to swing left and right following a mark position displayed on the client screen, with a photo taken at a random moment; the score is given according to the degree of match between the user's head position and the mark position at the moment the photo is taken, and the higher the degree of match, the higher the score of the user's somatosensory action. For this item, the correspondence between each piece of preset somatosensory information (corresponding to multiple preset positions) and a score is set in advance. The obtained somatosensory information of the current user contains the user's head position information and the mark position information; the user's somatosensory information is compared with the multiple pieces of preset somatosensory information, the preset somatosensory information matching the current user's somatosensory information is taken as the target preset somatosensory information, and the score corresponding to the target preset somatosensory information is taken as the scoring result of the current user.
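The matching step described above can be pictured with the following Python sketch, in which the current user's head offset from the mark is compared against a table of preset entries and the closest entry's score is returned. The preset table, the offset representation and the values are assumptions made purely for illustration.

```python
# Illustrative sketch of matching the current user's somatosensory information
# against preset entries; the preset table and offset representation are assumptions.

PRESET_SCORES = [                      # offset of the head from the mark -> score
    {"offset_from_mark": 0.00, "score": 100},
    {"offset_from_mark": 0.10, "score": 80},
    {"offset_from_mark": 0.25, "score": 50},
    {"offset_from_mark": 0.50, "score": 20},
]

def score_head_position(head_x, mark_x):
    """Pick the preset entry closest to the observed head-to-mark offset."""
    observed = abs(head_x - mark_x)
    target = min(PRESET_SCORES, key=lambda p: abs(p["offset_from_mark"] - observed))
    return target["score"]             # score of the target preset somatosensory information

current_score = score_head_position(head_x=0.42, mark_x=0.50)   # -> 80
```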
In another specific example of this embodiment of the present disclosure, after the somatosensory information corresponding to the somatosensory action in the user picture is determined, a somatosensory action image obtained by processing on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user may also be obtained as the interaction result.
The somatosensory action image is an image, obtained on the basis of the user's somatosensory information, that contains the user's somatosensory action and other information related to the somatosensory information. For example, the somatosensory action image may be an image that contains the current user's somatosensory action and is obtained by superimposing a magic expression on the user's somatosensory action.
In this embodiment of the present disclosure, the current client may process the obtained somatosensory information of the current user to obtain a somatosensory action image, process the received somatosensory action sent by at least one associated user to obtain a somatosensory action image matching the associated user, and finally take the somatosensory action images of the current user and the at least one associated user as the interaction result. Alternatively, the current client may process only the somatosensory action of the current user to obtain a somatosensory action image, then receive the somatosensory action image sent by at least one associated user, and take the somatosensory action images of the current user and the at least one associated user as the interaction result.
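As a rough illustration of producing such a somatosensory action image, the following Python sketch (using the Pillow library) pastes a magic-expression sticker onto the captured user frame near a detected face region; the file paths and the face-box coordinates are placeholders rather than values from the disclosure.

```python
# Rough sketch: build a "somatosensory action image" by pasting a magic-expression
# sticker over the detected face in the captured frame. Paths and the face box
# are placeholders.
from PIL import Image

def build_action_image(frame_path, sticker_path, face_box):
    frame = Image.open(frame_path).convert("RGBA")
    sticker = Image.open(sticker_path).convert("RGBA")
    # Scale the sticker to the width of the detected face box.
    width = face_box[2] - face_box[0]
    sticker = sticker.resize((width, int(sticker.height * width / sticker.width)))
    # Paste it just above the face, using its alpha channel as the mask.
    frame.paste(sticker, (face_box[0], max(face_box[1] - sticker.height, 0)), sticker)
    return frame

action_image = build_action_image("frame.png", "magic_expression.png", (300, 200, 500, 420))
```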
S23: display the interaction result in the live-streaming interface to which the current user picture belongs.
In the technical solution of this embodiment of the present disclosure, the somatosensory information corresponding to the somatosensory action in the user picture is determined according to the collected user picture of the current user; the scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user is obtained as the interaction result; and the interaction result is finally displayed in the live-streaming interface to which the current user picture belongs. Interaction can thus take place in the live-streaming room in the form of a somatosensory item, and the scores of multiple users are ultimately displayed, which enriches the scenarios of user interaction in the live-streaming room.
Fig. 3a is a flowchart of a live-streaming somatosensory item interaction method according to an exemplary embodiment. This embodiment of the present disclosure refines the technical solution described above, and the technical solution in this embodiment may be combined with the optional solutions in one or more of the embodiments described above. As shown in Fig. 3a, the live-streaming somatosensory item interaction method includes the following steps.
S31: according to the collected user picture of the current user, determine the somatosensory information corresponding to the somatosensory action in the user picture.
S32: transmit user somatosensory information and/or interaction results through the game information link bridge between the respective clients of the current user and the associated user.
The game information link bridge (Game Information Bridge, GIB) is a data exchange service technology derived from an extension of the UDP protocol. It uses UDP to broadcast data and allows the type of data broadcast to be specified, such as droppable or loss-resistant, and performs corresponding processing according to the broadcast type. For example, for loss-resistant data, the receiving end sends a corresponding acknowledgement packet (Acknowledge character, ACK) upon receipt, and in response to the sending end not receiving the ACK packet, the data is retransmitted. The game information link bridge service is reliable and has low latency.
In this embodiment of the present disclosure, information is transmitted through the game information link bridge between the respective clients of the current user and the associated user, and the transmitted information may be user somatosensory information and/or interaction results. For example, the current client may send the current user's somatosensory information to at least one associated user through the game information link bridge and, at the same time, receive the user somatosensory information sent by at least one associated user through the game information link bridge, and finally obtain the interaction result by processing the somatosensory information of each user. The current client may also transmit the interaction result directly through the game information link bridge, as shown in Fig. 3b: the interaction result of the current user, for example a scoring result, is sent to at least one associated user, while the scoring result corresponding to each associated user sent by at least one associated user is received through the game information link bridge, and the scoring results of all users are finally taken as the final interaction result.
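The following Python sketch illustrates, in simplified form, the kind of UDP exchange described for the game information link bridge: loss-resistant packets are retransmitted until an ACK arrives, while droppable packets are sent once. The address, port, packet layout and retry count are assumptions; the actual GIB service is not limited to this behavior.

```python
# Simplified sketch of the UDP-style exchange of the game information link bridge:
# loss-resistant packets are retransmitted until an ACK is received, droppable
# packets are sent once. Address, port and packet layout are assumptions.
import json
import socket

def send_gib_packet(payload, peer=("127.0.0.1", 9000), loss_resistant=True, retries=3):
    data = json.dumps({"type": "loss_resistant" if loss_resistant else "droppable",
                       "payload": payload}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(0.2)
        for _ in range(retries if loss_resistant else 1):
            sock.sendto(data, peer)
            if not loss_resistant:
                return True                  # droppable data: fire and forget
            try:
                ack, _ = sock.recvfrom(1024)
                if ack == b"ACK":
                    return True              # receiver acknowledged the packet
            except socket.timeout:
                continue                     # no ACK received, retransmit
    return False

# e.g. send the current user's scoring result to the associated user's client
send_gib_packet({"user_id": "current_user", "score": 30})
```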
S33: perform interaction processing on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user, to obtain an interaction result.
In this embodiment of the present disclosure, after transmitting user somatosensory information through the game information link bridge, the current client performs interaction processing on the somatosensory information of the current user and the somatosensory information of the associated user to obtain the final interaction result. For example, according to the user somatosensory information of the current user and the associated user, the client calculates the scores of the somatosensory actions of the current user and the associated user by using the preset scoring rule of the live-streaming somatosensory item, and takes the scoring results of all users as the interaction result.
In an implementation of this embodiment of the present disclosure, performing processing on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result includes:
processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and
taking the item information of the current user and the associated user together as the interaction result.
The above optional embodiment provides a specific way of performing processing on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result. First, the somatosensory information of the current user and the somatosensory information of the associated user transmitted through the game information link bridge are processed, for example by scoring the somatosensory action of each user according to the somatosensory information, or by controlling the somatosensory action image displayed on the client according to the somatosensory information. The item information of the current user and the associated user is thus obtained, and the item information of all users is taken together as the interaction result, where the item information may include a scoring result and a somatosensory action image.
In a possible implementation, after the somatosensory information corresponding to the somatosensory action in the user picture is determined, the method further includes:
reporting the somatosensory information to a server;
correspondingly, obtaining the interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user includes:
obtaining the interaction result processed by the server on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
The above optional embodiment also provides a specific way of obtaining the interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user. After the user's somatosensory information is obtained, it is first reported to the server; the server processes the somatosensory information reported by the current user and the somatosensory information reported by at least one associated user, obtains the interaction result, and feeds it back to the client corresponding to each user. For example, the server scores the somatosensory information reported by each user according to a preset scoring rule, obtains the scoring result of each user, and feeds the scoring result of each user back to each user.
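A hypothetical sketch of this server-side flow is shown below: each client reports its user's somatosensory information, the server scores every report with a preset rule and returns the full set of scores as the interaction result. The rule and data layout are assumptions used only to illustrate the flow.

```python
# Hypothetical sketch of the server-side flow: clients report somatosensory
# information, the server scores every report with a preset rule and feeds the
# complete result set back to every matched client. The rule is an assumption.

def preset_rule(somatosensory_info):
    # Assumed rule: one point per recorded blink.
    return somatosensory_info.get("blink_count", 0)

def score_match(reports):
    """reports: {user_id: somatosensory_info} collected from all matched clients."""
    interaction_result = {user_id: preset_rule(info) for user_id, info in reports.items()}
    return interaction_result        # pushed back to each client for display

result = score_match({"user_a": {"blink_count": 30}, "user_b": {"blink_count": 25}})
# {'user_a': 30, 'user_b': 25}
```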
S34: display the interaction result in the live-streaming interface to which the current user picture belongs.
In the technical solution of this embodiment of the present disclosure, the somatosensory information corresponding to the somatosensory action in the user picture is determined according to the collected user picture of the current user; user somatosensory information and/or interaction results are transmitted through the game information link bridge between the respective clients of the current user and the associated user; after the user somatosensory information is transmitted, interaction processing is performed on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result; and the interaction result is finally displayed in the live-streaming interface to which the current user picture belongs. Interaction can thus take place in the live-streaming room in the form of a somatosensory item, which enriches the scenarios of user interaction in the live-streaming room.
Fig. 4a is a flowchart of a live-streaming somatosensory item interaction method according to an exemplary embodiment. This embodiment of the present disclosure refines the technical solution described above, and the technical solution in this embodiment may be combined with the optional solutions in one or more of the embodiments described above. As shown in Fig. 4a, the live-streaming somatosensory item interaction method includes the following steps.
S41: in response to the current user's operation of opening a live-streaming room and selecting a somatosensory item duel, send an opponent matching request to the server.
In this embodiment of the present disclosure, the live-streaming somatosensory item is a form of interaction among multiple users. Therefore, after the user opens the live-streaming room and selects the somatosensory item duel option, the client sends an opponent matching request to the server, where the opponent matching request contains the ID of the current user and identification information of the somatosensory item selected by the current user. The server may determine the somatosensory item selected by the current user according to the somatosensory item identification information contained in the opponent matching request, then obtain all users who have selected that somatosensory item, and randomly select one or more of them to match with the current user. The number selected depends on the duel mode chosen by the user: in response to the user selecting a two-player duel, only one user needs to be selected to match with the current user; in response to the user selecting a multi-player duel, multiple users are selected to match with the current user. Before the client receives the associated user fed back by the server, the display shows the word "matching" to the user, as shown in Fig. 4b, which allows the user to see the loading status of the current item.
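For illustration, the matchmaking described above might look like the following Python sketch, where the request carries the current user's ID, the selected item and the duel mode, and the server draws opponents at random from the pool of users waiting on the same item; the field names and the multi-player group size are assumptions.

```python
# Illustrative matchmaking sketch: the opponent matching request carries the
# current user's ID, the selected somatosensory item and the duel mode; the
# server draws opponents at random from users waiting on the same item.
# Field names and the multi-player group size are assumptions.
import random

def match_opponents(request, waiting_pool):
    """request = {'user_id': ..., 'item_id': ..., 'mode': '1v1' or 'multi'}"""
    candidates = [u for u in waiting_pool
                  if u["item_id"] == request["item_id"]
                  and u["user_id"] != request["user_id"]]
    count = 1 if request["mode"] == "1v1" else min(3, len(candidates))
    return random.sample(candidates, count) if len(candidates) >= count else []

opponents = match_opponents(
    {"user_id": "u1", "item_id": "blink_race", "mode": "1v1"},
    [{"user_id": "u2", "item_id": "blink_race"}, {"user_id": "u3", "item_id": "head_swing"}],
)   # -> [{'user_id': 'u2', 'item_id': 'blink_race'}]
```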
S42: receive at least one associated user matched with the current user and an interaction start instruction fed back by the server.
In this embodiment of the present disclosure, the client receives the at least one associated user matched with the current user that is fed back by the server in response to the opponent matching request, and at the same time receives the interaction start instruction fed back by the server, so as to start the live-streaming somatosensory item in response to the instruction. For example, the client receives the ID of at least one associated user fed back by the server, determines the one or more other users who will compete with the current user, and displays the relevant information of the associated user at the corresponding position on the screen; meanwhile, the client receives the countdown to the start of the live-streaming somatosensory item fed back by the server and displays the countdown on the screen to prompt the user that the item is about to start.
S43: in response to the interaction start instruction sent by the server, collect a user picture containing the user's somatosensory action.
In this embodiment of the present disclosure, after the server sends the interaction start instruction, the client starts to collect the user picture of the current user. The user picture contains the somatosensory action of the current user and is used to determine the interaction result among the multiple users according to the user's somatosensory action.
S44: input the user picture into a somatosensory information acquisition model, and obtain the somatosensory information, output by the somatosensory information acquisition model, corresponding to the user's somatosensory action.
This embodiment of the present disclosure provides an implementation of determining, according to the user picture, the somatosensory information corresponding to the somatosensory action in the user picture: the collected user picture is input into a somatosensory information acquisition model, and the somatosensory information output by the model is taken as the somatosensory information corresponding to the somatosensory action of the current user. The somatosensory information acquisition model may be a model capable of recognizing the input user picture and obtaining the characteristics of the user's somatosensory action; for example, when the user shakes their head, the model can determine that the type of the user's somatosensory action is head shaking and obtain the amplitude and frequency of the shaking.
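The interface of such a somatosensory information acquisition model can be sketched as follows; the class, the model file name and the structure of the output (action type, amplitude, frequency) are assumptions showing only the expected shape of the call, not an actual trained model.

```python
# Sketch of the expected interface of a somatosensory information acquisition
# model: a captured frame goes in, a structured description of the action
# (type, amplitude, frequency) comes out. The class, file name and output
# layout are assumptions; no real network is loaded here.
import numpy as np

class SomatosensoryModel:
    def __init__(self, weights_path="somatosensory_model.onnx"):
        self.weights_path = weights_path   # placeholder: a real client would load the model

    def infer(self, frame: np.ndarray) -> dict:
        # A real implementation would run the network on the frame; this stub
        # only shows the shape of the output the client expects.
        return {"action_type": "head_shake", "amplitude_deg": 25.0, "frequency_hz": 1.5}

model = SomatosensoryModel()
somatosensory_info = model.infer(np.zeros((720, 1280, 3), dtype=np.uint8))
```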
S45: obtain an interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user, where the associated user is a user matched with the current user.
S46: display the interaction result in the live-streaming interface to which the current user picture belongs.
In the technical solution of this embodiment of the present disclosure, in response to the current user's operation of opening a live-streaming room and selecting a somatosensory item duel, an opponent matching request is sent to the server; then, in response to the interaction start instruction sent by the server, a user picture containing the user's somatosensory action is collected and input into the somatosensory information acquisition model, and the somatosensory information, output by the model, corresponding to the user's somatosensory action is obtained; finally, the interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user is obtained and displayed in the live-streaming interface to which the current user's user picture belongs. Interaction can thus take place in the live-streaming room in the form of a somatosensory item, which enriches the scenarios of user interaction in the live-streaming room.
Fig. 5 is a block diagram of a live-streaming somatosensory item interaction apparatus according to an exemplary embodiment. Referring to Fig. 5, the apparatus includes a somatosensory information determination module 510, an interaction result acquisition module 520 and an interaction result display module 530.
The somatosensory information determination module 510 is configured to determine, according to the collected user picture of the current user, the somatosensory information corresponding to the somatosensory action in the user picture;
the interaction result acquisition module 520 is configured to obtain an interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
the interaction result display module 530 is configured to display the interaction result in the live-streaming interface to which the current user picture belongs.
In an implementation of this embodiment of the present disclosure, the interaction result acquisition module 520 includes:
a first interaction result acquisition unit configured to obtain, as the interaction result, a scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
In an implementation of this embodiment of the present disclosure, the first interaction result acquisition unit is further configured to:
compare the somatosensory information of the current user with at least one piece of preset somatosensory information, and determine target preset somatosensory information matching the somatosensory information of the current user;
take the score corresponding to the target preset somatosensory information as the scoring result of the current user;
receive the scoring result of the associated user sent by at least one associated user; and
take the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
In an implementation of this embodiment of the present disclosure, the interaction result acquisition module 520 is further configured to:
obtain, as the interaction result, a somatosensory action image obtained by processing on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user.
In an implementation of this embodiment of the present disclosure, the interaction result acquisition module 520 is further configured to:
transmit user somatosensory information and/or interaction results through the game information link bridge between the respective clients of the current user and the associated user.
In an implementation of this embodiment of the present disclosure, the interaction result acquisition module 520 further includes:
a second interaction result acquisition unit configured to perform interaction processing on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain an interaction result.
In an implementation of this embodiment of the present disclosure, the second interaction result acquisition unit is further configured to:
process the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and
take the item information of the current user and the associated user together as the interaction result.
In an implementation of this embodiment of the present disclosure, the live-streaming somatosensory item interaction apparatus further includes:
an information reporting module configured to report the somatosensory information to a server;
correspondingly, the interaction result acquisition module 520 is further configured to:
obtain the interaction result processed by the server on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user.
In an implementation of this embodiment of the present disclosure, the live-streaming somatosensory item interaction apparatus further includes:
a request sending module configured to send an opponent matching request to the server in response to the current user's operation of opening a live-streaming room and selecting a somatosensory item duel; and
an instruction receiving module configured to receive at least one associated user matched with the current user and an interaction start instruction fed back by the server.
In an implementation of this embodiment of the present disclosure, the somatosensory information determination module 510 includes:
a user picture collection unit configured to collect, in response to the interaction start instruction sent by the server, a user picture containing the user's somatosensory action; and
a somatosensory information acquisition unit configured to input the user picture into a somatosensory information acquisition model and obtain the somatosensory information, output by the somatosensory information acquisition model, corresponding to the user's somatosensory action.
In an implementation of this embodiment of the present disclosure, the somatosensory information contains at least one item of motion information among the user's head motion, facial motion and limb motion.
With regard to the live-streaming somatosensory item interaction apparatus in the above embodiments, the specific manner in which each unit performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.
Fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment. As shown in Fig. 6, the electronic device includes:
a processor 610;
one processor 610 is taken as an example in Fig. 6; and
a memory 620.
The processor 610 and the memory 620 in the device may be connected by a bus or in other ways; connection by a bus is taken as an example in Fig. 6.
As a non-transitory computer-readable storage medium, the memory 620 may be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the live-streaming somatosensory item interaction method in the embodiments of the present disclosure (for example, the somatosensory information determination module 510, the interaction result acquisition module 520 and the interaction result display module 530 shown in Fig. 5). By running the software programs, instructions and modules stored in the memory 620, the processor 610 executes the various functional applications and data processing of the computer device, that is, implements the live-streaming somatosensory item interaction method of the above method embodiments, namely:
determining, according to the collected user picture of the current user, the somatosensory information corresponding to the somatosensory action in the user picture;
obtaining an interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
displaying the interaction result in the live-streaming interface to which the current user picture belongs.
The memory 620 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application required by at least one function, and the data storage area may store data created according to the use of the computer device, and the like. In addition, the memory 620 may include a high-speed random access memory and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or another non-transitory solid-state storage device. In some embodiments, the memory 620 may optionally include memories remotely located relative to the processor 610, and these remote memories may be connected to the terminal device through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
In an exemplary embodiment, a storage medium including instructions, for example the memory 620 including instructions, is also provided, and the instructions may be executed by the processor 610 of the electronic device to complete the above method. The storage medium may be a non-transitory computer-readable storage medium, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In some embodiments, a computer program product for use in combination with an electronic device is also provided. The computer program product includes a computer-readable storage medium and a computer program mechanism embedded therein; after the program is loaded into and executed by a computer, the live-streaming somatosensory item interaction method described in any embodiment of the present disclosure can be implemented.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein. The present disclosure is intended to cover any variations, uses or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or customary technical means in the technical field not disclosed by the present disclosure. The specification and examples are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (44)

1. A live-streaming somatosensory item interaction method, applied to a client, the method comprising:
    determining, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
    obtaining an interaction result processed on the basis of the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
    displaying the interaction result in a live-streaming interface to which the current user picture belongs.
2. The live-streaming somatosensory item interaction method according to claim 1, wherein obtaining the interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user comprises:
    obtaining, as the interaction result, a scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
3. The live-streaming somatosensory item interaction method according to claim 2, wherein obtaining, as the interaction result, the scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user comprises:
    comparing the somatosensory information of the current user with at least one piece of preset somatosensory information, and determining target preset somatosensory information matching the somatosensory information of the current user;
    taking a score corresponding to the target preset somatosensory information as a scoring result of the current user;
    receiving a scoring result of the associated user sent by the at least one associated user; and
    taking the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
4. The live-streaming somatosensory item interaction method according to claim 1, wherein obtaining the interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user comprises:
    obtaining, as the interaction result, a somatosensory action image obtained by processing on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
5. The live-streaming somatosensory item interaction method according to claim 1, wherein obtaining the interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user comprises:
    transmitting user somatosensory information and/or interaction results through a game information link bridge between respective clients of the current user and the associated user.
6. The live-streaming somatosensory item interaction method according to claim 5, wherein the method further comprises:
    performing interaction processing on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result.
7. The live-streaming somatosensory item interaction method according to claim 6, wherein performing processing on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result comprises:
    processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information comprises a scoring result and a somatosensory action image; and
    taking the item information of the current user and the associated user together as the interaction result.
8. The live-streaming somatosensory item interaction method according to claim 1, wherein the method further comprises:
    reporting the somatosensory information to a server;
    correspondingly, obtaining the interaction result processed on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user comprises:
    obtaining the interaction result processed by the server on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
9. The live-streaming somatosensory item interaction method according to claim 1, wherein the method further comprises:
    sending an opponent matching request to a server in response to the current user's operation of opening a live-streaming room and selecting a somatosensory item duel; and
    receiving at least one associated user matched with the current user and an interaction start instruction fed back by the server.
10. The live-streaming somatosensory item interaction method according to claim 1, wherein determining, according to the collected user picture of the current user, the somatosensory information corresponding to the somatosensory action in the user picture comprises:
    collecting, in response to an interaction start instruction sent by a server, a user picture containing the user's somatosensory action; and
    inputting the user picture into a somatosensory information acquisition model, and obtaining the somatosensory information, output by the somatosensory information acquisition model, corresponding to the user's somatosensory action.
11. The live-streaming somatosensory item interaction method according to any one of claims 1-10, wherein the somatosensory information contains at least one item of motion information among the user's head motion, facial motion and limb motion.
12. A live-streaming somatosensory item interaction apparatus, comprising:
    a somatosensory information determination module configured to determine, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
    an interaction result acquisition module configured to obtain an interaction result processed on the basis of the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
    an interaction result display module configured to display the interaction result in a live-streaming interface to which the current user picture belongs.
13. The live-streaming somatosensory item interaction apparatus according to claim 12, wherein the interaction result acquisition module comprises:
    a first interaction result acquisition unit configured to obtain, as the interaction result, a scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
14. The live-streaming somatosensory item interaction apparatus according to claim 13, wherein the first interaction result acquisition unit is further configured to:
    compare the somatosensory information of the current user with at least one piece of preset somatosensory information, and determine target preset somatosensory information matching the somatosensory information of the current user;
    take a score corresponding to the target preset somatosensory information as a scoring result of the current user;
    receive a scoring result of the associated user sent by the at least one associated user; and
    take the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
15. The live-streaming somatosensory item interaction apparatus according to claim 12, wherein the interaction result acquisition module is further configured to:
    obtain, as the interaction result, a somatosensory action image obtained by processing on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
16. The live-streaming somatosensory item interaction apparatus according to claim 12, wherein the interaction result acquisition module is further configured to:
    transmit user somatosensory information and/or interaction results through a game information link bridge between respective clients of the current user and the associated user.
17. The live-streaming somatosensory item interaction apparatus according to claim 16, wherein the interaction result acquisition module further comprises:
    a second interaction result acquisition unit configured to perform interaction processing on the basis of the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain the interaction result.
18. The live-streaming somatosensory item interaction apparatus according to claim 17, wherein the second interaction result acquisition unit is further configured to:
    process the somatosensory information of the current user and the transmitted somatosensory information of the associated user to determine item information of the current user and the associated user, wherein the item information comprises a scoring result and a somatosensory action image; and
    take the item information of the current user and the associated user together as the interaction result.
19. The live-streaming somatosensory item interaction apparatus according to claim 12, wherein the live-streaming somatosensory item interaction apparatus further comprises:
    an information reporting module configured to report the somatosensory information to a server;
    correspondingly, the interaction result acquisition module is further configured to:
    obtain the interaction result processed by the server on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
20. The live-streaming somatosensory item interaction apparatus according to claim 12, wherein the live-streaming somatosensory item interaction apparatus further comprises:
    a request sending module configured to send an opponent matching request to a server in response to the current user's operation of opening a live-streaming room and selecting a somatosensory item duel; and
    an instruction receiving module configured to receive at least one associated user matched with the current user and an interaction start instruction fed back by the server.
21. The live-streaming somatosensory item interaction apparatus according to claim 12, wherein the somatosensory information determination module comprises:
    a user picture collection unit configured to collect, in response to an interaction start instruction sent by a server, a user picture containing the user's somatosensory action; and
    a somatosensory information acquisition unit configured to input the user picture into a somatosensory information acquisition model and obtain the somatosensory information, output by the somatosensory information acquisition model, corresponding to the user's somatosensory action.
22. The live-streaming somatosensory item interaction apparatus according to any one of claims 12-21, wherein the somatosensory information contains at least one item of motion information among the user's head motion, facial motion and limb motion.
23. An electronic device, comprising:
    a processor; and
    a memory for storing instructions executable by the processor;
    wherein the processor is configured to execute the instructions to implement the following steps: determining, according to a collected user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
    obtaining an interaction result processed on the basis of the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
    displaying the interaction result in a live-streaming interface to which the current user picture belongs.
24. The electronic device according to claim 23, wherein the processor is configured to execute the instructions to further implement the following step:
    obtaining, as the interaction result, a scoring result determined by scoring somatosensory actions on the basis of the somatosensory information of the current user and the somatosensory information of the at least one associated user.
25. The electronic device according to claim 24, wherein the processor is configured to execute the instructions to further implement the following steps:
    comparing the somatosensory information of the current user with at least one piece of preset somatosensory information, and determining target preset somatosensory information matching the somatosensory information of the current user;
    taking a score corresponding to the target preset somatosensory information as a scoring result of the current user;
    receiving a scoring result of the associated user sent by the at least one associated user; and
    taking the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
  26. 根据权利要求23所述的电子设备,其中,所述处理器被配置为执行所述指令时还实现以下步骤:23. The electronic device of claim 23, wherein the processor is configured to further implement the following steps when executing the instructions:
    获取基于所述当前用户的体感信息以及至少一个关联用户的体感信息进行处理后得到的体感动作图像,作为所述互动结果。A somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of at least one associated user is acquired as the interaction result.
  27. 根据权利要求23所述的电子设备,其中,所述处理器被配置为执行所述指令时还实现以下步骤:23. The electronic device of claim 23, wherein the processor is configured to further implement the following steps when executing the instructions:
    通过所述当前用户和关联用户各自客户端之间的游戏信息链接桥,进行用户体感信息和/或互动结果的传输。The user somatosensory information and/or the interaction result is transmitted through the game information link bridge between the respective clients of the current user and the associated user.
  28. 根据权利要求27所述的电子设备,其中,所述处理器被配置为执行所述指令时还实现以下步骤:28. The electronic device of claim 27, wherein the processor is configured to further implement the following steps when executing the instructions:
    基于所述当前用户的体感信息以及传输的所述关联用户的体感信息进行互动处理,以获取互动结果。Interactive processing is performed based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to obtain an interactive result.
  29. The electronic device according to claim 28, wherein the processor is further configured to execute the instructions to implement the following steps:
    processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user, and determining item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and
    taking the item information of the current user and the associated user together as the interaction result.
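  One way to read claims 28-29 is that each client, once it holds both sides' somatosensory information, derives per-user item information (a score plus a somatosensory action image) and bundles it as the interaction result. The sketch below assumes simple dictionaries and a toy scoring rule; all names are hypothetical.

```python
from typing import Dict, List

def build_item_info(user_id: str, somato_info: List[float]) -> Dict[str, object]:
    # Placeholder derivations: a real client would run scoring and image rendering here.
    score = round(sum(somato_info) * 10, 1)          # assumed toy scoring rule
    action_image = f"{user_id}_pose.png"             # assumed handle to a rendered action image
    return {"score": score, "action_image": action_image}

def build_interaction_result(current_id: str, current_info: List[float],
                             peers: Dict[str, List[float]]) -> Dict[str, Dict[str, object]]:
    result = {current_id: build_item_info(current_id, current_info)}
    for peer_id, peer_info in peers.items():         # info received over the link bridge
        result[peer_id] = build_item_info(peer_id, peer_info)
    return result

print(build_interaction_result("current_user", [0.9, 1.1],
                               {"associated_user": [0.5, 0.4]}))
```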
  30. The electronic device according to claim 23, wherein the processor is further configured to execute the instructions to implement the following steps:
    reporting the somatosensory information to a server;
    correspondingly, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of the at least one associated user includes:
    acquiring the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of the at least one associated user.
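  Claim 30 covers the alternative where processing happens server-side: the client only reports its somatosensory information and then fetches the server-computed interaction result. The in-memory server object and its toy aggregation rule below are assumptions used to keep the sketch self-contained.

```python
from typing import Dict, List

class InteractionServer:
    """Illustrative in-memory stand-in for the server that aggregates somatosensory information."""

    def __init__(self) -> None:
        self._reports: Dict[str, List[float]] = {}

    def report(self, user_id: str, somato_info: List[float]) -> None:
        self._reports[user_id] = somato_info          # a client uploads its somatosensory information

    def interaction_result(self) -> Dict[str, float]:
        # Assumed toy rule: score each user by the magnitude of the reported motion.
        return {uid: round(sum(info), 2) for uid, info in self._reports.items()}

server = InteractionServer()
server.report("current_user", [0.9, 1.1])
server.report("associated_user", [0.5, 0.4])
print(server.interaction_result())   # the client fetches this and shows it in the live interface
```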
  31. The electronic device according to claim 23, wherein the processor is further configured to execute the instructions to implement the following steps:
    sending an opponent matching request to a server in response to an operation of the current user opening a live streaming room and selecting a somatosensory item duel; and
    receiving, from the server, at least one associated user matched with the current user and an interaction start instruction.
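  A sketch of the matchmaking handshake in claim 31: the client requests an opponent when the user opens a live streaming room and picks a somatosensory duel, and the server answers with the matched user and a start instruction. The queue-based matcher and the "INTERACTION_START" token are assumptions.

```python
from collections import deque
from typing import Deque, Optional, Tuple

class Matchmaker:
    """Illustrative stand-in for server-side opponent matching."""

    def __init__(self) -> None:
        self._waiting: Deque[str] = deque()

    def request_match(self, user_id: str) -> Optional[Tuple[str, str]]:
        # Pair the requester with the earliest waiting user, if any.
        if self._waiting:
            opponent = self._waiting.popleft()
            return opponent, "INTERACTION_START"    # matched associated user + start instruction
        self._waiting.append(user_id)
        return None                                  # keep waiting until an opponent arrives

matchmaker = Matchmaker()
print(matchmaker.request_match("user_a"))   # None: user_a waits for an opponent
print(matchmaker.request_match("user_b"))   # ('user_a', 'INTERACTION_START')
```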
  32. The electronic device according to claim 23, wherein the processor is further configured to execute the instructions to implement the following steps:
    collecting, in response to an interaction start instruction sent by a server, a user picture containing a somatosensory action of the user; and
    inputting the user picture into a somatosensory information acquisition model, and acquiring the somatosensory information, output by the somatosensory information acquisition model, that corresponds to the somatosensory action of the user.
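  Claim 32 routes the captured user picture through a "somatosensory information acquisition model" without naming one. The sketch below assumes a generic keypoint-style model behind a small interface, with a trivial stand-in implementation so the example runs as written.

```python
from typing import Dict, List, Protocol

class SomatoModel(Protocol):
    def __call__(self, frame: List[List[int]]) -> Dict[str, List[float]]: ...

class DummyKeypointModel:
    """Stand-in for a trained pose/keypoint network; returns fixed keypoints."""

    def __call__(self, frame: List[List[int]]) -> Dict[str, List[float]]:
        return {"head": [0.5, 0.2], "left_hand": [0.3, 0.6], "right_hand": [0.7, 0.6]}

def acquire_somatosensory_info(frame: List[List[int]], model: SomatoModel) -> Dict[str, List[float]]:
    # Feed the user picture to the model and take its output as the somatosensory information.
    return model(frame)

frame = [[0] * 4 for _ in range(4)]      # dummy captured user picture
print(acquire_somatosensory_info(frame, DummyKeypointModel()))
```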
  33. The electronic device according to any one of claims 23-32, wherein the somatosensory information includes at least one piece of motion information among head motion, facial motion and limb motion of the user.
  34. A non-volatile computer-readable storage medium, wherein when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the following steps: determining, according to a captured user picture of a current user, somatosensory information corresponding to a somatosensory action in the user picture;
    acquiring an interaction result obtained by processing based on the somatosensory information of the current user and somatosensory information of at least one associated user, wherein the associated user is a user matched with the current user; and
    displaying the interaction result in combination with a live streaming interface to which the current user picture belongs.
  35. The storage medium according to claim 34, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step:
    acquiring, as the interaction result, a scoring result determined after somatosensory action scoring is performed based on the somatosensory information of the current user and the somatosensory information of the at least one associated user.
  36. The storage medium according to claim 35, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps:
    comparing the somatosensory information of the current user with at least one piece of preset somatosensory information, and determining target preset somatosensory information that matches the somatosensory information of the current user;
    taking a score corresponding to the target preset somatosensory information as a scoring result of the current user;
    receiving a scoring result of the associated user sent by the at least one associated user; and
    taking the scoring result of the current user and the scoring result of the at least one associated user as the interaction result.
  37. The storage medium according to claim 34, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step:
    acquiring, as the interaction result, a somatosensory action image obtained by processing based on the somatosensory information of the current user and the somatosensory information of the at least one associated user.
  38. The storage medium according to claim 34, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step:
    transmitting user somatosensory information and/or the interaction result through a game information link bridge between the respective clients of the current user and the associated user.
  39. The storage medium according to claim 38, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following step:
    performing interaction processing based on the somatosensory information of the current user and the transmitted somatosensory information of the associated user to acquire the interaction result.
  40. The storage medium according to claim 39, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps:
    processing the somatosensory information of the current user and the transmitted somatosensory information of the associated user, and determining item information of the current user and the associated user, wherein the item information includes a scoring result and a somatosensory action image; and
    taking the item information of the current user and the associated user together as the interaction result.
  41. The storage medium according to claim 34, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps:
    reporting the somatosensory information to a server;
    correspondingly, acquiring the interaction result obtained by processing based on the somatosensory information of the current user and the somatosensory information of the at least one associated user includes:
    acquiring the interaction result obtained by the server through processing based on the somatosensory information of the current user and the somatosensory information of the at least one associated user.
  42. The storage medium according to claim 34, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps:
    sending an opponent matching request to a server in response to an operation of the current user opening a live streaming room and selecting a somatosensory item duel; and
    receiving, from the server, at least one associated user matched with the current user and an interaction start instruction.
  43. The storage medium according to claim 34, wherein when the instructions in the storage medium are executed by the processor of the electronic device, the electronic device is further enabled to perform the following steps:
    collecting, in response to an interaction start instruction sent by a server, a user picture containing a somatosensory action of the user; and
    inputting the user picture into a somatosensory information acquisition model, and acquiring the somatosensory information, output by the somatosensory information acquisition model, that corresponds to the somatosensory action of the user.
  44. The storage medium according to any one of claims 34-43, wherein the somatosensory information includes at least one piece of motion information among head motion, facial motion and limb motion of the user.
PCT/CN2021/105588 2020-07-24 2021-07-09 Livestream motion sensing project interaction method and apparatus WO2022017201A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010725851.8 2020-07-24
CN202010725851.8A CN111866535B (en) 2020-07-24 2020-07-24 Live somatosensory item interaction method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
WO2022017201A1 true WO2022017201A1 (en) 2022-01-27

Family

ID=72949507

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/105588 WO2022017201A1 (en) 2020-07-24 2021-07-09 Livestream motion sensing project interaction method and apparatus

Country Status (2)

Country Link
CN (1) CN111866535B (en)
WO (1) WO2022017201A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111866535B (en) * 2020-07-24 2022-09-02 北京达佳互联信息技术有限公司 Live somatosensory item interaction method, device, equipment and storage medium
CN113194321B (en) * 2021-03-22 2023-02-17 北京达佳互联信息技术有限公司 Interaction method and interaction device for live broadcast room, electronic equipment and storage medium
CN113453032B (en) * 2021-06-28 2022-09-30 广州虎牙科技有限公司 Gesture interaction method, device, system, server and storage medium
CN113965771A (en) * 2021-10-22 2022-01-21 成都天翼空间科技有限公司 VR live broadcast user interactive experience system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108616761B (en) * 2018-05-23 2021-03-16 广州方硅信息技术有限公司 Multi-person video interaction method, device, system, storage medium and computer equipment
CN109348241B (en) * 2018-10-26 2021-05-14 广州方硅信息技术有限公司 Video playing method and device in multi-user video live broadcasting room and computer equipment
CN110149526B (en) * 2019-05-29 2021-11-02 北京达佳互联信息技术有限公司 Live broadcast interactive system, control method and device thereof and storage medium
CN110460867B (en) * 2019-07-31 2021-08-31 广州方硅信息技术有限公司 Connecting interaction method, connecting interaction system, electronic equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170209795A1 (en) * 2016-01-27 2017-07-27 Electronic Arts Inc. Systems and methods for capturing participant likeness for a video game character
CN107911724A (en) * 2017-11-21 2018-04-13 广州华多网络科技有限公司 Living broadcast interactive method, apparatus and system
CN110213613A (en) * 2018-08-09 2019-09-06 腾讯科技(深圳)有限公司 Image processing method, device and storage medium
CN110191353A (en) * 2019-06-12 2019-08-30 北京百度网讯科技有限公司 Live streaming connects method, apparatus, equipment and the computer readable storage medium of wheat
CN111432266A (en) * 2020-03-31 2020-07-17 北京达佳互联信息技术有限公司 Interactive information display method, device, terminal and storage medium
CN111866535A (en) * 2020-07-24 2020-10-30 北京达佳互联信息技术有限公司 Live somatosensory item interaction method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111866535A (en) 2020-10-30
CN111866535B (en) 2022-09-02

Similar Documents

Publication Publication Date Title
WO2022017201A1 (en) Livestream motion sensing project interaction method and apparatus
US11571620B2 (en) Using HMD camera touch button to render images of a user captured during game play
CN107029429B (en) System, method, and readable medium for implementing time-shifting tutoring for cloud gaming systems
CN107911724B (en) Live broadcast interaction method, device and system
US20220410007A1 (en) Virtual character interaction method and apparatus, computer device, and storage medium
US10155161B2 (en) Information processing system, computer-readable storage medium having information processing program stored therein, information processing apparatus, and information processing method
CN102947777B (en) Usertracking feeds back
JP2020072841A (en) Filtering and parental control method for limiting visual operation on head-mounted display
US20170087476A1 (en) Systems and Methods for Providing Augmented Data-Feed for Game Play Re-creation and Dynamic Replay Entry Points
US20170087475A1 (en) Systems and Methods for Providing Time-Shifted Intelligently Synchronized Game Video
CN110472099B (en) Interactive video generation method and device and storage medium
CN107096221A (en) System and method for providing time shift intelligent synchronization game video
CN110677685B (en) Network live broadcast display method and device
CN110324652A (en) Game interaction method and system, electronic equipment and the device with store function
CN116437137B (en) Live broadcast processing method and device, electronic equipment and storage medium
CN112287848A (en) Live broadcast-based image processing method and device, electronic equipment and storage medium
CN115631270A (en) Live broadcast method and device of virtual role, computer storage medium and terminal
CN114286021B (en) Rendering method, rendering device, server, storage medium, and program product
CN111773702A (en) Control method and device for live game
CN109039851B (en) Interactive data processing method and device, computer equipment and storage medium
WO2015058388A1 (en) Method and device for displaying image
JP2021111102A (en) Moving image generation device and live communication system
KR102294376B1 (en) Online Golf Lesson Service Systems and Methods
US11117051B2 (en) Video game program and game system
CN114425162A (en) Video processing method and related device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21846452

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21846452

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19-05-2023)
