CN117440184A - Live broadcast equipment and control method thereof - Google Patents


Info

Publication number
CN117440184A
Authority
CN
China
Prior art keywords: live, display screen, light effect, live broadcast, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311753676.3A
Other languages
Chinese (zh)
Other versions
CN117440184B (en)
Inventor
李永红
何文龙
刘军涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Electron Technology Co ltd
Original Assignee
Shenzhen Electron Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Electron Technology Co ltd filed Critical Shenzhen Electron Technology Co ltd
Priority to CN202311753676.3A
Publication of CN117440184A
Application granted
Publication of CN117440184B
Legal status: Active (current)
Anticipated expiration

Classifications

    • H04N21/23424: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement
    • H04N21/2187: Live feed
    • H04N21/2387: Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • H04N21/2393: Interfacing the upstream path of the transmission network involving handling client requests
    • H04N21/25875: Management of end-user data involving end-user authentication
    • H04N21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4318: Generation of visual interfaces for content selection or interaction by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
    • H04N21/437: Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server
    • H04N21/44016: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • H04N21/4415: Acquiring end-user identification using biometric characteristics of the user, e.g. by voice recognition or fingerprint scanning
    • H04N21/44218: Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H04N21/4854: End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • H04N21/4858: End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows


Abstract

The application relates to a live broadcast device and a control method thereof. The live broadcast device shoots live image data in front of a display screen through a camera module; transmits the live image data to a live server; receives live room media data sent by the live server; determines, in a scene control mode, the live scene information corresponding to the live image data; determines corresponding light effect display picture data according to the live scene information; and thereby displays a live light effect image in the light effect picture display area at the edge of the display screen. In this way the light effect in the edge area of the display screen is played automatically according to the live scene and changes more flexibly, without an additional fill light and without complicated operations by the host, which significantly improves the live broadcast effect, gives the audience a better viewing experience, and improves the host's live broadcast experience.

Description

Live broadcast equipment and control method thereof
Technical Field
The application relates to the technical field of live broadcasting, in particular to live broadcasting equipment and a control method thereof.
Background
With the rapid development of internet technology and streaming media technology, network live broadcasting has gradually become an increasingly popular means of entertainment, socializing, and marketing. More and more users act as hosts and carry out various types of online interaction in live rooms, such as live dancing, live singing, live selling, and live interviews.
At present, hosts mainly broadcast live through a mobile phone: the phone is usually fixed on a support frame and its front camera is turned on to shoot the live broadcast, while the host watches the live room picture shown on the phone's display screen. To improve the live broadcast effect, the host usually needs dedicated lighting equipment for supplementary light and has to adjust its light effect through complicated operations. However, this existing live broadcast mode offers only a single light effect and cumbersome adjustment, and there is still room for improvement in both the live broadcast effect and the host's live broadcast experience.
Disclosure of Invention
Based on this, the aim of the application is to provide a live broadcast device and a control method of the live broadcast device that can conveniently control rich light effects, improving the live broadcast effect and the host's live broadcast experience.
The first aspect of the embodiment of the application provides a control method of live broadcast equipment, wherein the live broadcast equipment comprises a camera module, a communication module, a display screen and a control module; the camera module, the communication module and the display screen are respectively connected to the control module; the camera module is used for shooting live image data in front of the display screen, and the communication module is used for being in communication connection with a live server; the display area of the display screen is divided into a live broadcasting room picture display area positioned in the center and a light effect picture display area positioned at the edge; the control method comprises the following steps:
Shooting live image data in front of the display screen through the camera module;
transmitting the live image data to a live server;
receiving live broadcasting room media data sent by the live broadcasting server, wherein the live broadcasting room media data is generated based on the live broadcasting image data;
responding to a scene light effect instruction triggered by a user, and setting a light effect control mode of the display screen as a scene control mode;
determining live broadcast scene information corresponding to the live broadcast image data in the scene control mode;
determining corresponding light effect display picture data according to the live scene information;
synthesizing display screen picture data according to the live broadcasting room media data and the light effect display picture data, wherein the display screen picture data comprise live broadcasting room media images positioned in a central area and live broadcasting light effect images positioned in an edge area, the live broadcasting room media images are generated by the live broadcasting room media data, and the live broadcasting light effect images are generated by the light effect display picture data;
and controlling the display screen to display the picture data of the display screen, wherein the live broadcasting room media image is displayed in a live broadcasting room picture display area of the display screen, and the live broadcasting light effect image is displayed in a light effect picture display area of the display screen.
The second aspect of the embodiment of the application provides a live broadcast device, which comprises a camera module, a display screen, a communication module, a memory and a control module; the camera module, the display screen, the communication module and the memory are respectively connected to the control module; the camera module is used for shooting live image data in front of the display screen; the communication module is used for being in communication connection with the live broadcast server; the memory stores a computer readable program which, when executed by the control module, implements the steps of the method according to any of the embodiments of the present application.
In the control method of the live broadcast device, the live broadcast device shoots live image data in front of the display screen through the camera module; transmits the live image data to a live server; receives live room media data sent by the live server; in response to a scene light effect instruction triggered by a user, sets the light effect control mode of the display screen to a scene control mode; determines, in the scene control mode, the live scene information corresponding to the live image data; determines corresponding light effect display picture data according to the live scene information; synthesizes display screen picture data according to the live room media data and the light effect display picture data; and controls the display screen so that the live room media image in the display screen picture data is shown in the live room picture display area and the live light effect image is shown in the light effect picture display area. In this way the light effect in the edge area of the display screen is played automatically according to the live scene and changes more flexibly; no additional fill light is needed, nor any complicated operation by the host, which significantly improves the live broadcast effect, gives the audience a better viewing experience, and improves the host's live broadcast experience.
For a better understanding and implementation, the present application is described in detail below with reference to the drawings.
Drawings
Fig. 1 is an application scenario schematic diagram of a control method of a live broadcast device according to an embodiment of the present application;
fig. 2 is a flow chart of a control method of a live broadcast device according to an embodiment of the present application;
fig. 3 is a schematic step diagram of a music control mode of a live broadcast device according to an embodiment of the present application;
fig. 4 is a schematic diagram of steps of music light effect matching of the live broadcast device according to the embodiment of the present application;
fig. 5 is a schematic diagram illustrating steps of turning on light effect music of a live broadcast device according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating steps of a gesture control mode of a live broadcast device according to an embodiment of the present application;
fig. 7 is a schematic diagram of steps of adaptive control of a display screen of a live broadcast device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings. Where the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated.
It should be understood that the embodiments described in the examples described below do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in this application, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items; for example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship.
It should be appreciated that, although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms; these terms are merely used to distinguish between similar objects and do not necessarily describe a particular order or sequence or imply relative importance. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art as the case may be. The word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination", depending on the context.
Referring to fig. 1, fig. 1 is an application scenario schematic diagram of a control method of a live broadcast device according to an embodiment of the present application, where the application scenario includes a live broadcast device, and the live broadcast device is an intelligent device for live broadcast, and is provided with a camera module 101, a communication module 102, a display screen 103, and a control module 104; the camera module 101, the communication module 102 and the display screen 103 are respectively connected to the control module 104; the camera module 101 is used for shooting live image data in front of the display screen, and the communication module 102 is used for being in communication connection with a live server; the display area of the display 103 is divided into a live room screen display area 1031 at the center and a light effect screen display area 1032 at the edge.
In this embodiment, a host may log in to a live client account on the live broadcast device and start a live broadcast. When the live broadcast starts, the live broadcast device turns on the camera module to shoot and uploads the captured data to the live server through the communication module, and at the same time receives, through the communication module, the live room media data sent by the live server; it then synthesizes display screen picture data based on the live room media data and the light effect display picture data, where the central area of the display screen picture data is a live room media image generated from the live room media data and the edge area is a live light effect image generated from the light effect display picture data. When the display screen of the live broadcast device shows the live light effect image, the light effect supplements light on the host and sets off the lighting atmosphere of the live room, so that the live image data captured by the camera module, and the live room media data generated from it, change their picture effect accordingly and can present a rich picture effect to the audience of the live room.
Referring to fig. 2, an embodiment of the present application discloses a control method of a live broadcast device, where the control method includes the following steps:
S101: shooting live image data in front of the display screen through the camera module;
S102: transmitting the live image data to a live server;
S103: receiving live broadcasting room media data sent by the live broadcasting server, wherein the live broadcasting room media data is generated based on the live broadcasting image data;
S104: responding to a scene light effect instruction triggered by a user, and setting a light effect control mode of the display screen as a scene control mode;
S105: determining live broadcast scene information corresponding to the live broadcast image data in the scene control mode;
S106: determining corresponding light effect display picture data according to the live scene information;
S107: synthesizing display screen picture data according to the live broadcasting room media data and the light effect display picture data, wherein the display screen picture data comprise live broadcasting room media images positioned in a central area and live broadcasting light effect images positioned in an edge area, the live broadcasting room media images are generated by the live broadcasting room media data, and the live broadcasting light effect images are generated by the light effect display picture data;
S108: and controlling the display screen to display the picture data of the display screen, wherein the live broadcasting room media image is displayed in a live broadcasting room picture display area of the display screen, and the live broadcasting light effect image is displayed in a light effect picture display area of the display screen.
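For readability, the following non-limiting sketch (added to this text, not part of the original disclosure) shows one way the flow of steps S101 to S108 could be organized; the function names, the Frame type, and the callable-based wiring are illustrative assumptions only.

```python
# Hypothetical sketch of the S101-S108 control flow; all names are illustrative.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Frame:
    pixels: bytes        # raw image payload (placeholder representation)
    timestamp_ms: int

def control_cycle(
    capture: Callable[[], Frame],                # S101: camera module shoots live image data
    upload: Callable[[Frame], None],             # S102: send live image data to the live server
    receive_room_media: Callable[[], Frame],     # S103: live-room media data from the server
    scene_mode_enabled: bool,                    # S104: scene control mode set by the user
    recognize_scene: Callable[[Frame], str],     # S105: determine live scene information
    pick_light_effect: Callable[[str], Frame],   # S106: scene -> light effect display picture data
    compose: Callable[[Frame, Frame], Frame],    # S107: center room image plus edge light effect
    display: Callable[[Frame], None],            # S108: drive the display screen
) -> None:
    live_frame = capture()                       # S101
    upload(live_frame)                           # S102
    room_media = receive_room_media()            # S103
    if scene_mode_enabled:                       # S104
        scene = recognize_scene(live_frame)      # S105
        effect = pick_light_effect(scene)        # S106
        screen = compose(room_media, effect)     # S107
        display(screen)                          # S108
```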
In the control method of the live broadcast device, the live broadcast device shoots live image data in front of the display screen through the camera module; transmits the live image data to a live server; receives live room media data sent by the live server; in response to a scene light effect instruction triggered by a user, sets the light effect control mode of the display screen to a scene control mode; determines, in the scene control mode, the live scene information corresponding to the live image data; determines corresponding light effect display picture data according to the live scene information; synthesizes display screen picture data according to the live room media data and the light effect display picture data; and controls the display screen so that the live room media image in the display screen picture data is shown in the live room picture display area and the live light effect image is shown in the light effect picture display area. In this way the light effect in the edge area of the display screen is played automatically according to the live scene and changes more flexibly; no additional fill light is needed, nor any complicated operation by the host, which significantly improves the live broadcast effect, gives the audience a better viewing experience, and improves the host's live broadcast experience.
For step S101, live image data in front of the display screen is shot by the camera module.
For step S102, the live image data is sent to a live server.
For step S103, live room media data sent by the live server is received, where the live room media data is generated based on the live image data.
For step S104, in response to a scene light effect instruction triggered by a user, setting a light effect control mode of the display screen to a scene control mode.
The scene lighting effect instruction can be generated by live broadcast equipment or sent to the live broadcast equipment by other communication equipment. In one embodiment, a live device displays a preset scene light setting control on a display screen, the scene light setting control generating a scene light instruction in response to a touch by a user. In another embodiment, the live broadcast device is in communication connection with other communication devices through the communication module, and receives scene lighting effect instructions sent by the other communication devices.
In this embodiment, the display screen of the live broadcast device may have multiple light effect control modes, for example, a scene control mode, a gesture control mode, and a music control mode, and in different light effect control modes, the live broadcast device automatically controls live broadcast light effect images of the display screen according to different status information, for example, live broadcast scene information, music feature information, gesture feature information, and the like. In one embodiment, the live broadcast device may further set a light effect control mode of the display screen to a music control mode in response to a music light effect instruction triggered by a user. In another embodiment, the live broadcast device may further set the light effect control mode of the display screen to a gesture control mode in response to a gesture control instruction triggered by a user.
For step S105, in the scene control mode, the live image data is identified and corresponding live scene information is determined.
Wherein the live scene information is information for characterizing live scenes including live sales scenes, live singing scenes, live dancing scenes, and/or live interview scenes.
In one embodiment, the step of determining live scene information corresponding to the live image data in the scene control mode in step S105 includes:
step S1051, recognizing text information in the live image data, and determining at least one piece of live scene information matched with the text information through a preset semantic scene recognition model;
step S1052, if the matched live broadcast scene information is one, determining that the matched live broadcast scene information is the live broadcast scene information corresponding to the live broadcast image data;
step S1053, if the matched live broadcast scene information is more than two, identifying live broadcast feature data of a target anchor in the live broadcast image data, wherein the live broadcast feature data comprises body posture features and hand action features; and respectively determining the matching degree of the preset live broadcast feature data of each piece of matched live broadcast scene information and the live broadcast feature data of the target anchor, and determining the live broadcast scene information with the highest matching degree as the live broadcast scene information corresponding to the live broadcast image data.
The semantic scene recognition model is used to recognize the semantics of the text information and to identify the corresponding live scene according to those semantics. It can be obtained by labeling common text information from various live rooms with the corresponding live scene information (such as live selling or live singing), using the labeled text information as training data, and training a neural network on that data; of course, the model can also be obtained through other training methods.
The live broadcast feature data of the target anchor refers to body feature data of the target anchor during the live broadcast, such as body posture features, hand motion features, and leg motion features. The body posture feature is judged from the overall posture of the body and may be, for example, a sitting, standing, or lying posture; the hand motion feature is judged from the motion of the hands and may be, for example, raising the palm upward (holding an article), lifting an article, placing both hands flat, or waving the hands; the leg motion feature is judged from the motion of the legs and may be, for example, standing upright, jumping up and down, or jumping left and right.
In this embodiment, text information in the live image data is first recognized, and at least one piece of live scene information matching the text information is determined through a preset semantic scene recognition model. If only one piece of live scene information matches, it is the live scene information corresponding to the live image data. If several pieces of live scene information match, for example because some text information is associated with several different live scenes, this embodiment further determines which live scene is closer according to the body posture and action features of the host. In general, the body posture and actions of the host differ between types of live scene: in a live interview scene the host usually sits with both hands flat (on a table), in a live selling scene the host usually sits and holds the object with the palm up, and in a live dancing scene the host usually stands and waves both hands, and so on. Therefore, the live feature data of the target anchor in the live image data is identified, the preset live feature data of each matched piece of live scene information is obtained and its matching degree with the target anchor's live feature data is calculated, and the live scene information with the highest matching degree is finally determined as the live scene information corresponding to the live image data.
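As a hedged illustration of steps S1051 to S1053, the sketch below resolves multiple text-matched scenes by comparing preset scene features with the anchor's recognized posture and hand-action features; the feature encoding, the scene table, and the overlap-based matching degree are assumptions made for the example, not the patent's definitions.

```python
# Illustrative sketch of steps S1051-S1053: text-based scene matching first,
# then disambiguation by body-posture / hand-action features when several scenes match.
from typing import Dict, List, Set

# Hypothetical preset live-broadcast feature data per scene.
SCENE_FEATURES: Dict[str, Set[str]] = {
    "live_selling":   {"sitting", "palm_up_holding_item"},
    "live_interview": {"sitting", "hands_flat_on_table"},
    "live_dancing":   {"standing", "waving_both_hands"},
}

def match_degree(scene_features: Set[str], anchor_features: Set[str]) -> float:
    # Simple overlap ratio as a stand-in for the "matching degree".
    if not scene_features:
        return 0.0
    return len(scene_features & anchor_features) / len(scene_features)

def resolve_scene(text_matched_scenes: List[str], anchor_features: Set[str]) -> str:
    if len(text_matched_scenes) == 1:          # S1052: a single match wins
        return text_matched_scenes[0]
    # S1053: pick the scene whose preset features best match the anchor's
    # body posture and hand actions.
    return max(text_matched_scenes,
               key=lambda s: match_degree(SCENE_FEATURES.get(s, set()), anchor_features))

# Example: the text matched both selling and interview; the anchor is sitting and
# holding an item with the palm up, so "live_selling" scores higher.
print(resolve_scene(["live_selling", "live_interview"],
                    {"sitting", "palm_up_holding_item"}))
```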
For step S106, corresponding light effect display picture data is determined according to the live scene information.
In this embodiment, a plurality of light effect display picture data are preset, and different live broadcast scenes bind the corresponding light effect display picture data respectively.
The light effect display picture data is image data to be shown on the display screen and comprises several frames of light effect image data. A frame of light effect image data may be all-white image data (in an RGB display system, the R, G, B pixel values are all 255; the light-emitting power of the display screen may be adjusted to produce different brightness), all-black image data (the R, G, B pixel values are all 0), pure-color image data (the R, G, B pixel values are not all 255 and at least one is not 0; here too the light-emitting power of the display screen may be adjusted to produce different brightness), or a combination of several different types of image data. In this embodiment, the several frames of light effect image data may be at least one frame of white image data and/or at least one frame of all-black image data and/or at least one frame of color image data, combined according to a certain rule, so that when the frames are displayed continuously frame by frame, a rich, changing light image effect can be presented.
In one embodiment, the plurality of frames of light effect image data include one or more of white light image data, full black image data, color image data, and pattern image data.
Wherein the color image data includes image data of at least one color; the white light image data at least comprises two white light image data with different brightness, the color image data at least comprises two color image data with different brightness, and the pattern image data at least comprises two pattern image data with different brightness.
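The following sketch is only an illustration of how a multi-frame light effect could be encoded as solid RGB frames with varying brightness, in line with the pixel-value description above; the specific colors and the alternation rule are example assumptions.

```python
# Minimal sketch: encode a light-effect sequence as solid RGB frames.
from typing import List, Tuple

RGB = Tuple[int, int, int]

WHITE: RGB = (255, 255, 255)   # all-white frame
BLACK: RGB = (0, 0, 0)         # all-black frame
WARM:  RGB = (255, 180, 120)   # an example pure-color frame

def breathing_sequence(color: RGB, steps: int = 8) -> List[RGB]:
    """Scale a base color to several brightness levels (dim to bright)."""
    return [tuple(int(c * (i + 1) / steps) for c in color) for i in range(steps)]

def example_light_effect() -> List[RGB]:
    # Combine frame types "according to a certain rule": a warm breathing
    # ramp followed by a white/black flash.
    return breathing_sequence(WARM) + [WHITE, BLACK, WHITE, BLACK]

if __name__ == "__main__":
    for frame in example_light_effect():
        print(frame)
```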
For step S107, display screen picture data is synthesized according to the live broadcast room media data and the light effect display picture data, where the display screen picture data includes a live broadcast room media image located in a center area and a live broadcast light effect image located in an edge area, the live broadcast room media image is generated by the live broadcast room media data, and the live broadcast light effect image is generated by the light effect display picture data.
The live broadcasting room media data are live broadcasting media stream data generated at a live broadcasting server according to live broadcasting image data sent by live broadcasting equipment, wherein the live broadcasting media stream data comprise a plurality of frames of live broadcasting room image data.
When the live room media data and the light effect display picture data are synthesized, each frame of live room image data is combined, one by one in timestamp order, with a frame of light effect image data (each frame of the light effect display picture data is matched to the live room image data according to a preset order) to form one frame of display screen picture image. The live room image data is located in the central area, corresponding to the live room picture display area of the display screen; the light effect image data is located in the edge area, corresponding to the light effect picture display area of the display screen.
In an optional embodiment, the step of synthesizing display screen frame data according to the live broadcast room media data and the light effect display frame data in step S107 includes:
step S1071, determining a first synthesis area of each frame of live broadcasting room image data according to the live broadcasting room picture display area of the display screen;
step S1072, determining a second synthesis area of each frame of light effect image data according to the light effect picture display area of the display screen;
step S1073, according to the first synthesis area and the second synthesis area, synthesizing each frame of live broadcasting room image data and a corresponding frame of light effect image data into a frame of display screen image data one by one, and obtaining a plurality of frames of synthesized display screen image data as the display screen picture data.
In this embodiment, a first synthesis area for the live room image data and a second synthesis area for the light effect image data are determined according to the live room picture display area and the light effect picture display area of the display screen, respectively; then, according to the first synthesis area and the second synthesis area, each frame of live room image data and the corresponding frame of light effect image data are synthesized, one by one, into one frame of display screen image data, and the resulting frames are obtained as the display screen picture data. Therefore, when the live broadcast device plays the synthesized display screen picture data, the live room picture is shown in the middle and the light effect picture is shown at the edge, and the light emitted by the light effect picture can illuminate the host in front of the display screen to supplement light or enhance the live lighting atmosphere.
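A minimal sketch of the per-frame composition in steps S1071 to S1073 is given below, assuming frames are NumPy arrays and that the light effect frame covers the full screen while the live-room frame is pasted into the central region; the resolutions and colors are arbitrary example values, not parameters from the patent.

```python
# Illustrative per-frame composition: edge light effect plus centered live-room image.
import numpy as np

def compose_screen_frame(room_frame: np.ndarray,
                         effect_frame: np.ndarray) -> np.ndarray:
    """room_frame: HxWx3 center image; effect_frame: full-screen HxWx3 image."""
    screen = effect_frame.copy()                 # second synthesis area (edge)
    sh, sw, _ = screen.shape
    rh, rw, _ = room_frame.shape
    top, left = (sh - rh) // 2, (sw - rw) // 2   # first synthesis area (center)
    screen[top:top + rh, left:left + rw] = room_frame
    return screen

# Example: a 1080x1920 screen with a 900x1600 live-room picture in the middle,
# surrounded by a solid warm light-effect border.
effect = np.full((1080, 1920, 3), (255, 180, 120), dtype=np.uint8)
room = np.zeros((900, 1600, 3), dtype=np.uint8)
frame = compose_screen_frame(room, effect)
print(frame.shape)
```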
And for step S108, controlling the display screen to display the picture data of the display screen, wherein the live broadcasting room media image is displayed in a live broadcasting room picture display area of the display screen, and the live broadcasting light effect image is displayed in a light effect picture display area of the display screen.
Referring to fig. 3, in one embodiment, the method for controlling a live broadcast device further includes the steps of:
S201: responding to a music light effect instruction triggered by a user, and setting a light effect control mode of the display screen to be a music control mode;
S202: in the music control mode, when the live broadcast equipment plays music, acquiring currently played music data;
S203: identifying rhythm feature information of the music data;
S204: acquiring light effect display picture data matched with the rhythm characteristic information;
S205: synthesizing display screen picture data according to the light effect display picture data and the live broadcasting room media data;
S206: and controlling the display screen to display the picture data of the display screen.
The music light effect instruction can be generated by live broadcast equipment or sent to the live broadcast equipment by other communication equipment. In one embodiment, the live device displays a preset music light effect trigger control on a display screen, the music light effect trigger control generating a music light effect instruction in response to a touch by a user. In another embodiment, the live broadcast device is in communication connection with other communication devices through the communication module, and receives the music light effect instruction sent by the other communication devices.
The light effect display picture data matched with the rhythm feature information is light effect display picture data whose light effect change rhythm matches that rhythm feature information. Specifically, light effect change rhythm data corresponding to the different light effect display picture data may be preset in the live broadcast device and compared with the rhythm feature information; if the matching degree between a light effect change rhythm and the music's rhythm feature information is greater than a preset matching threshold, the corresponding light effect display picture data is the light effect display picture data matched with the rhythm feature information.
In this embodiment, in the music control mode, the live broadcast device monitors whether it is playing music; if so, it obtains the currently played music data, identifies its rhythm feature information, and then matches light effect display picture data according to that rhythm feature information. As a result, when the display screen picture data synthesized from this light effect display picture data is played, the light effect picture display area at the edge of the display screen flashes a light effect corresponding to the music rhythm, which enriches the atmosphere of the live room, brings a sense of rhythm to the host and the audience, and improves the live broadcast experience.
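The sketch below illustrates one possible way to match recognized rhythm feature information against preset light effect change rhythms using a matching threshold, as described above; representing a rhythm as a list of beat intervals in milliseconds and the particular similarity measure are assumptions for the example.

```python
# Hedged sketch of step S204: match music rhythm to a preset light-effect rhythm.
from typing import Dict, List, Optional

def rhythm_similarity(music_beats: List[int], effect_beats: List[int]) -> float:
    """Crude similarity in [0, 1] between two beat-interval sequences (ms)."""
    n = min(len(music_beats), len(effect_beats))
    if n == 0:
        return 0.0
    diffs = [abs(m - e) / max(m, e) for m, e in zip(music_beats, effect_beats)]
    return 1.0 - sum(diffs) / n

def pick_matching_effect(music_beats: List[int],
                         preset_effects: Dict[str, List[int]],
                         threshold: float = 0.8) -> Optional[str]:
    best_name, best_score = None, 0.0
    for name, effect_beats in preset_effects.items():
        score = rhythm_similarity(music_beats, effect_beats)
        if score > best_score:
            best_name, best_score = name, score
    # Only accept the match if it exceeds the preset matching threshold.
    return best_name if best_score > threshold else None

print(pick_matching_effect([500, 500, 500, 500],
                           {"slow_fade": [1000] * 4, "fast_flash": [480] * 4}))
```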
Referring to fig. 4, in one embodiment, the method for controlling a live broadcast device further includes the steps of:
S301: responding to a light effect matching instruction triggered by a user, judging whether the live broadcast equipment is playing music, and if yes, acquiring currently played music data;
S302: identifying rhythm feature information of the music data;
S303: matching corresponding light effect type information according to the rhythm characteristic information;
S304: acquiring at least one light effect display picture data belonging to the light effect type according to the light effect type information, and displaying the at least one light effect display picture data through the display screen;
S305: receiving a selection instruction of a user for any one of the light effect display picture data displayed by the display screen, and synthesizing the display screen picture data according to the selected light effect display picture data and the live broadcasting room media data;
S306: and controlling the display screen to display the picture data of the display screen.
The light effect matching instruction can be generated by live broadcast equipment or sent to the live broadcast equipment by other communication equipment. In one embodiment, the live device displays a preset light effect matching trigger control on a display screen, the light effect matching trigger control generating a light effect matching instruction in response to a touch by a user. In another embodiment, the live broadcast device is in communication connection with other communication devices through the communication module, and receives a light effect matching instruction sent by the other communication devices.
In this embodiment, after the user triggers the light effect matching instruction, if the live broadcast device detects that music is currently being played, it obtains the currently played music data and identifies its rhythm feature information. Unlike the previous embodiment, the live broadcast device here matches corresponding light effect type information according to the rhythm feature information; one light effect type can cover many pieces of music with similar rhythm features. The live broadcast device then obtains at least one piece of light effect display picture data belonging to that light effect type and displays it for the user to choose from, and finally synthesizes the display screen picture data from the light effect display picture data selected by the user and the live room media data. When the display screen plays the display screen picture data, the light effect picture display area at the edge flashes a light effect corresponding to the rhythm changes of the music, which enriches the atmosphere of the live room, brings a sense of rhythm to the host and the audience, and improves the live broadcast experience.
Referring to fig. 5, in one embodiment, the method for controlling a live broadcast device further includes the steps of:
S401: responding to a light effect music start instruction triggered by a user, acquiring a preset light effect music menu and displaying the light effect music menu through the display screen; the light effect music menu is used for displaying light effect music information, wherein the light effect music information is bound with preset light effect display picture data;
S402: receiving a light effect music selection instruction selected by a user, and determining the corresponding light effect music information and the bound light effect display picture data;
S403: and playing the corresponding light effect music according to the light effect music information, synthesizing display screen picture data according to the light effect display picture data and the live broadcasting room media data, and controlling the display screen to display the display screen picture data.
The light effect music start instruction can be generated by the live broadcast device or sent to the live broadcast device by another communication device. In one embodiment, the live broadcast device displays a preset light effect music start control on the display screen, and this control generates a light effect music start instruction in response to the user's touch. In another embodiment, the live broadcast device is in communication connection with other communication devices through the communication module and receives the light effect music start instruction sent by those devices.
In this embodiment, the live broadcast device presets several pieces of light effect music, each bound to corresponding light effect display picture data, so that the user can directly trigger a light effect music start instruction and select the light effect music to be played from the preset light effect music menu; while playing the light effect music, the live broadcast device synchronously synthesizes display screen picture data from the bound light effect display picture data to show the corresponding light effect picture. In this embodiment, the live broadcast device does not need to recognize rhythm feature information or match music: the user presets favorite light effect music in advance and can directly select it according to the actual situation when the live scene requires it, which is more flexible and convenient and better meets the user's needs.
In one embodiment, the method for controlling a live broadcast device further includes the steps of:
step S404, if a light effect music setting instruction triggered by a user is received, controlling the display screen to display a light effect music adding interface, wherein the light effect music adding interface comprises a light effect music adding control and a light effect adding control;
step S405, obtaining a light effect music adding instruction triggered by a user, wherein the light effect music adding instruction comprises light effect music information selected by the user;
Step S406, a light effect adding instruction triggered by a user is obtained, wherein the light effect adding instruction comprises light effect display picture data selected by the user;
step S407, binding the light effect music information and the light effect display picture data, and adding the light effect music information into a preset light effect music menu.
The lighting effect music setting instruction can be generated by live broadcast equipment or sent to the live broadcast equipment by other communication equipment. In one embodiment, the live device displays a preset light music setting control on a display screen, the light music setting control generating light music setting instructions in response to a touch by a user. In another embodiment, the live broadcast device is in communication connection with other communication devices through the communication module, and receives the light effect music setting instruction sent by the other communication devices.
The light music addition control generates a light music addition instruction in response to a touch of a user.
The light effect addition control generates a light effect addition instruction in response to a touch of a user.
The light effect music information is used for obtaining light effect music, and can be light effect music data directly or an identifier (such as song name) or a link to the corresponding light effect music data.
In this embodiment, the user may trigger the light effect music adding instruction to select favorite light effect music and the corresponding light effect display picture data and add them to the light effect music menu, so that next time the user can select the light effect music from the preset light effect music menu and have the display screen show the corresponding light effect picture.
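As a hedged illustration of the binding in steps S404 to S407, the sketch below models a light effect music menu whose entries tie light effect music information to an identifier of the bound light effect display picture data; all class, field, and entry names are assumptions made for the example.

```python
# Minimal sketch: bind light-effect music info to light-effect display picture data.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LightMusicEntry:
    music_info: str               # song name, identifier, or link (per the text above)
    effect_frames_id: str         # identifier of the bound light-effect picture data

@dataclass
class LightMusicMenu:
    entries: List[LightMusicEntry] = field(default_factory=list)

    def add(self, music_info: str, effect_frames_id: str) -> None:
        # S407: bind the selected music and effect and store them in the menu.
        self.entries.append(LightMusicEntry(music_info, effect_frames_id))

    def select(self, index: int) -> LightMusicEntry:
        # S402: the user's selection yields both the music and its bound effect.
        return self.entries[index]

menu = LightMusicMenu()
menu.add("my_favourite_song", "warm_breathing_effect")   # S405 + S406 (example values)
print(menu.select(0))
```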
Referring to fig. 6, in one embodiment, the method for controlling a live broadcast device further includes the steps of:
S501: responding to a gesture control instruction triggered by a user, and setting a light effect control mode of the display screen to be a gesture control mode;
S502: in the gesture control mode, recognizing a gesture action of a target anchor in the live image data, and determining light effect display picture data corresponding to the gesture action;
S503: and synthesizing display screen picture data according to the light effect display picture data and the live broadcasting room media data, and controlling the display screen to display the display screen picture data.
The gesture control instruction may be generated by a live broadcast device, or may be sent to the live broadcast device by other communication devices. In one embodiment, a live device displays a preset gesture control trigger control on a display screen, the gesture control trigger control generating gesture control instructions in response to a user's touch. In another embodiment, the live broadcast device is in communication connection with other communication devices through the communication module, and receives gesture control instructions sent by the other communication devices.
In this embodiment, the host may flexibly control the light effect picture of the display screen through gesture actions. The live broadcast device presets feature data of several gesture actions and binds each to corresponding light effect display picture data; in the gesture control mode, it recognizes the gesture action of the target anchor in the live image data and determines the light effect display picture data corresponding to that gesture action, then synthesizes the display screen picture data from this light effect display picture data and the live room media data. In this way the light effect picture of the display screen is adjusted according to the host's gesture actions, so the host can flexibly control the light effect picture, which enriches the interactive experience and makes it convenient for the host to create rich live broadcast effects.
In one embodiment, in the step S502, in the gesture control mode, the step of identifying a gesture action of a target anchor in the live image data, and determining light effect display screen data corresponding to the gesture action includes:
step S5021, detecting a person target in the live image data and identifying a face of the person target;
step S5022, judging whether the face of the person target accords with the preset face feature with control authority, if so, determining that the person target is the target anchor.
In this embodiment, face recognition is used to determine whether a person target in the live image data has control authority; only a person target with control authority is taken as the target anchor. The live broadcast device recognizes gesture actions only for the target anchor and does not respond to other person targets or recognize their gestures. In a practical embodiment, the face features of the anchor who has control authority may be preset, so that only that anchor can control the light effect picture of the display screen through gestures.
In one embodiment, in the step S502, in the gesture control mode, the step of identifying a gesture action of a target anchor in the live image data, and determining light effect display screen data corresponding to the gesture action includes:
step S5023, detecting a hand target of a target anchor in a video frame, and acquiring image coordinate information of the hand target;
step S5024, determining action characteristics of the hand target according to image coordinate information of the hand target of continuous multi-frame video frames;
step S5025, if the motion characteristics of the hand target conform to any preset hand motion characteristics, determining the light effect display picture data corresponding to the hand motion characteristics.
In this embodiment, a coordinate system for the live image may be established in advance, so that when a hand target is identified in a video frame, the coordinate data of the hand target in this preset coordinate system, namely its image coordinates, can be obtained.
In this embodiment, the live broadcast device identifies the image coordinate information of the hand target in each video frame, determines the motion characteristics of the hand from the changes in this coordinate information across consecutive frames, matches the motion characteristics to a preset hand action, and determines the corresponding light effect display picture data according to that hand action.
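A minimal sketch of how such motion characteristics could be derived from the image coordinates of the hand target over consecutive frames and matched to preset hand actions follows. The four swipe gestures, the displacement threshold, and the light effect identifiers are illustrative assumptions rather than values from this application.

```python
from typing import Optional

import numpy as np

# Illustrative binding of preset hand actions to light effect display picture
# data identifiers (the names are hypothetical, not taken from this application).
ACTION_TO_EFFECT = {
    "swipe_left": "effect_cool_wave",
    "swipe_right": "effect_warm_wave",
    "swipe_up": "effect_brighten",
    "swipe_down": "effect_dim",
}

def classify_hand_action(coords: list[tuple[float, float]],
                         min_move: float = 40.0) -> Optional[str]:
    """coords: image coordinates (x, y) of the hand target over consecutive
    video frames. Returns a preset hand action name, or None if the overall
    displacement is too small to count as a gesture."""
    xs = np.array([p[0] for p in coords], dtype=float)
    ys = np.array([p[1] for p in coords], dtype=float)
    dx, dy = xs[-1] - xs[0], ys[-1] - ys[0]
    if max(abs(dx), abs(dy)) < min_move:
        return None
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"      # image y axis points down

def effect_for_gesture(coords: list[tuple[float, float]]) -> Optional[str]:
    action = classify_hand_action(coords)
    return ACTION_TO_EFFECT.get(action) if action else None
```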
Referring to fig. 7, in one embodiment, the live broadcast device includes a movable base, the display screen is disposed on the movable base, and the live broadcast device further includes a driving module disposed on the movable base for driving the display screen to rotate; the method further comprises the steps of:
S601: identifying a viewing angle parameter of the target anchor in the live image data in response to a display screen adaptive control instruction;
S602: determining a display direction control parameter corresponding to the viewing angle parameter;
S603: controlling the driving module to drive the display screen to rotate to the corresponding display direction according to the display direction control parameter.
The display screen adaptive control instruction may be generated by the live broadcast device itself, or may be sent to the live broadcast device by another communication device. In one embodiment, the live broadcast device displays a preset display screen adaptive trigger control on the display screen, and this trigger control generates the display screen adaptive control instruction in response to a user's touch. In another embodiment, the live broadcast device is communicatively connected to other communication devices through the communication module and receives the display screen adaptive control instruction sent by those devices.
In this embodiment, the display screen is controlled to rotate to the corresponding display direction according to the anchor's viewing angle, so that it adapts to that viewing angle. For example, when the anchor looks at the display screen with the head turned to one side, the display screen also rotates towards that side, so the anchor can still see the display screen picture with the head turned and can watch more comfortably.
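As a rough sketch of how a viewing angle parameter might be mapped to a display direction control parameter for the driving module, assuming a hypothetical driver interface and a limited mechanical rotation range (the 45-degree limit is an arbitrary illustrative value):

```python
def display_direction_from_angle(view_angle_deg: float,
                                 max_rotation_deg: float = 45.0) -> float:
    """Map the anchor's viewing angle parameter to a target rotation of the
    display screen, clamped to the mechanical range of the movable base."""
    return max(-max_rotation_deg, min(max_rotation_deg, view_angle_deg))

class ScreenDriver:
    """Hypothetical driving-module interface; a real device would issue motor
    commands through the control module instead of printing."""
    def rotate_to(self, angle_deg: float) -> None:
        print(f"rotating display screen to {angle_deg:+.1f} degrees")

def adapt_screen(view_angle_deg: float, driver: ScreenDriver) -> None:
    driver.rotate_to(display_direction_from_angle(view_angle_deg))

adapt_screen(20.0, ScreenDriver())   # e.g. head turned by roughly 20 degrees
```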
In one embodiment, the step of identifying the viewing angle parameter of the target anchor in the live image data in response to the display screen adaptive control instruction in step S601 includes:
step S6011, identifying the face features of the target anchor through a preset face feature recognition model, wherein the face features include the eyes and the nose;
step S6012, obtaining the image coordinates of the two eyes and the image coordinates of the nose of the target anchor;
step S6013, determining the image coordinates of the midpoint of the two eyes according to the image coordinates of the two eyes, calculating deflection angle data of the image coordinates of this midpoint relative to the image coordinates of the nose by taking the image coordinates of the nose as the circle center, and determining the deflection angle data as the viewing angle parameter of the target anchor.
The face feature recognition model is used to recognize face features in the live image. In this embodiment the face features are set to include the eyes and the nose, so the face feature recognition model recognizes the eyes and the nose and acquires their image coordinates in the live image.
In this embodiment, the deflection angle of the midpoint of the two eyes relative to the nose is calculated with the nose as the circle center, yielding the viewing angle parameter of the anchor. In general, whether the anchor faces the display screen straight on or with the head turned to one side, the nose remains oriented towards the display screen while the eyes deflect relative to the nose according to the angle of the head turn, so the viewing angle of the anchor can be accurately judged from the direction and magnitude of the deflection of the eye midpoint about the nose.
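A minimal sketch of the geometry described in steps S6011 to S6013 follows, assuming standard image coordinates and using atan2 to express the deflection of the eye midpoint about the nose; the exact form of the deflection angle data is not specified in this application, so this is only one possible reading.

```python
import math

def viewing_angle(left_eye, right_eye, nose):
    """left_eye, right_eye, nose: (x, y) image coordinates.
    Returns (angle_deg, magnitude_px): the direction and size of the deflection
    of the eye midpoint relative to the nose (the nose is the circle center)."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    mid_y = (left_eye[1] + right_eye[1]) / 2.0
    dx, dy = mid_x - nose[0], mid_y - nose[1]
    angle_deg = math.degrees(math.atan2(dy, dx))   # deflection direction
    magnitude = math.hypot(dx, dy)                 # deflection amplitude
    return angle_deg, magnitude

# Eyes shifted horizontally relative to the nose suggest the head is turned,
# so the driving module can rotate the display screen in that direction.
print(viewing_angle((300, 200), (360, 200), (320, 230)))
```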
The embodiment of the application also discloses live broadcast equipment, which comprises a camera module, a display screen, a communication module, a memory and a control module; the camera module, the display screen, the communication module and the memory are respectively connected to the control module; the camera module is used for shooting live image data in front of the display screen; the communication module is used for being in communication connection with the live broadcast server; the memory stores a computer readable program which, when executed by the control module, implements the steps of the method according to any of the embodiments of the present application.
Wherein the control module may include one or more processing cores. The control module connects the various parts of the live broadcast device using various interfaces and lines, and performs the functions of the live broadcast device and processes data by running or executing the instructions, programs, code sets or instruction sets stored in the memory and by invoking data in the memory. Optionally, the control module may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), or programmable logic array (Programmable Logic Array, PLA). The control module may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and so on; the GPU is used for rendering and drawing the content to be displayed by the touch display module; and the modem is used to handle wireless communications. It is understood that the modem may also not be integrated into the control module and may instead be implemented by a single chip.
The memory may include random access memory (Random Access Memory, RAM) or read-only memory (Read-Only Memory, ROM). Optionally, the memory includes a non-transitory computer-readable storage medium. The memory may be used to store instructions, programs, code, code sets, or instruction sets. The memory may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function), instructions for implementing the various method embodiments described above, and the like; the data storage area may store the data involved in the above method embodiments. Optionally, the memory may also be at least one storage device located remotely from the control module.
The above examples merely represent a few embodiments of the present application; they are described in relative detail, but this is not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make modifications and improvements without departing from the spirit of the present application, and the present application is intended to encompass such modifications and improvements.

Claims (10)

1. The control method of the live broadcast equipment is characterized in that the live broadcast equipment comprises a camera module, a communication module, a display screen and a control module; the camera module, the communication module and the display screen are respectively connected to the control module; the camera module is used for shooting live image data in front of the display screen, and the communication module is used for being in communication connection with a live server; the display area of the display screen is divided into a live broadcasting room picture display area positioned in the center and a light effect picture display area positioned at the edge; the control method comprises the following steps:
shooting live image data in front of the display screen through the camera module;
transmitting the live image data to a live server;
receiving live broadcasting room media data sent by the live broadcasting server, wherein the live broadcasting room media data is generated based on the live broadcasting image data;
responding to a scene light effect instruction triggered by a user, and setting a light effect control mode of the display screen as a scene control mode;
determining live broadcast scene information corresponding to the live broadcast image data in the scene control mode;
determining corresponding light effect display picture data according to the live scene information;
Synthesizing display screen picture data according to the live broadcasting room media data and the light effect display picture data, wherein the display screen picture data comprise live broadcasting room media images positioned in a central area and live broadcasting light effect images positioned in an edge area, the live broadcasting room media images are generated by the live broadcasting room media data, and the live broadcasting light effect images are generated by the light effect display picture data;
and controlling the display screen to display the picture data of the display screen, wherein the live broadcasting room media image is displayed in a live broadcasting room picture display area of the display screen, and the live broadcasting light effect image is displayed in a light effect picture display area of the display screen.
2. The method for controlling a live broadcast device according to claim 1, wherein the step of determining live broadcast scene information corresponding to the live broadcast image data in the scene control mode includes:
identifying text information in the live image data, and determining at least one live scene information matched with the text information through a preset semantic scene identification model;
if the matched live broadcast scene information is one, determining that the matched live broadcast scene information is the live broadcast scene information corresponding to the live broadcast image data;
If two or more pieces of live broadcast scene information are matched, identifying live broadcast feature data of a target anchor in the live broadcast image data, wherein the live broadcast feature data comprises body posture features and hand action features; and respectively determining the matching degree between the preset live broadcast feature data of each piece of matched live broadcast scene information and the live broadcast feature data of the target anchor, and determining the live broadcast scene information with the highest matching degree as the live broadcast scene information corresponding to the live broadcast image data.
3. The method of controlling a live broadcast device according to claim 1, wherein the live broadcast room media data comprises a plurality of frames of live broadcast room image data, and the light effect display screen data comprises a plurality of frames of light effect image data;
the step of synthesizing display screen picture data according to the live broadcasting room media data and the light effect display picture data comprises the following steps:
determining a first synthesis area of each frame of live broadcasting room image data according to the live broadcasting room picture display area of the display screen;
determining a second synthesis area of each frame of light effect image data according to the light effect picture display area of the display screen;
and according to the first synthesis region and the second synthesis region, synthesizing each frame of live broadcasting room image data and a corresponding frame of light effect image data into one frame of display screen image data one by one, and obtaining a plurality of frames of synthesized display screen image data as the display screen picture data.
4. The control method of a live broadcast apparatus according to claim 1, further comprising the step of:
responding to a music light effect instruction triggered by a user, and setting a light effect control mode of the display screen to be a music control mode;
in the music control mode, when the live broadcast equipment plays music, acquiring currently played music data;
identifying rhythm feature information of the music data;
acquiring light effect display picture data matched with the rhythm characteristic information;
synthesizing display screen picture data according to the light effect display picture data and the live broadcasting room media data;
and controlling the display screen to display the picture data of the display screen.
5. The control method of a live broadcast apparatus according to claim 1, further comprising the step of:
responding to a gesture control instruction triggered by a user, and setting a light effect control mode of the display screen to be a gesture control mode;
in the gesture control mode, recognizing a gesture action of a target anchor in the live image data, and determining light effect display picture data corresponding to the gesture action;
and synthesizing display screen picture data according to the light effect display picture data and the live broadcasting room media data, and controlling the display screen to display the display screen picture data.
6. The method according to claim 5, wherein in the gesture control mode, the step of identifying a gesture action of a target anchor in the live image data, and determining light effect display picture data corresponding to the gesture action includes:
detecting a person target in the live image data, and identifying a face of the person target;
judging whether the face of the person target accords with the preset face characteristics with control authority, if so, determining that the person target is the target anchor.
7. The method according to claim 5 or 6, wherein in the gesture control mode, the step of identifying a gesture action of a target anchor in the live image data, and determining light effect display picture data corresponding to the gesture action includes:
detecting a hand target of a target anchor in a video frame, and acquiring image coordinate information of the hand target;
determining action characteristics of the hand target according to image coordinate information of the hand target of continuous multi-frame video frames;
and if the action characteristics of the hand target accord with any preset hand action characteristics, determining the light effect display picture data corresponding to the hand action characteristics.
8. The method for controlling a live broadcast device according to claim 1, wherein the live broadcast device comprises a movable base, the display screen is arranged on the movable base, and the live broadcast device further comprises a driving module arranged on the movable base and used for driving the display screen to rotate;
the method further comprises the steps of:
identifying viewing angle parameters of a target anchor in the live image data in response to a display screen adaptive control instruction;
determining a display direction control parameter corresponding to the viewing angle parameter;
and controlling the driving module to drive the display screen to rotate to the corresponding display direction according to the display direction control parameters.
9. The method of claim 8, wherein the step of identifying viewing perspective parameters of a target anchor in the live image data in response to a display screen adaptive control instruction comprises:
identifying the face features of a target anchor through a preset face feature identification model, wherein the face features comprise eyes and a nose;
acquiring image coordinates of two eyes and image coordinates of a nose of the target anchor;
and determining the image coordinates of the midpoint of the two eyes according to the image coordinates of the two eyes, calculating deflection angle data of the image coordinates of the midpoint of the two eyes relative to the image coordinates of the nose by taking the image coordinates of the nose as a circle center, and determining the deflection angle data as the viewing angle parameter of the target anchor.
10. The live broadcast equipment is characterized by comprising a camera module, a display screen, a communication module, a memory and a control module; the camera module, the display screen, the communication module and the memory are respectively connected to the control module; the camera module is used for shooting live image data in front of the display screen; the communication module is used for being in communication connection with the live broadcast server; the memory stores a computer readable program which, when executed by the control module, implements the steps of the method of any of claims 1-9.
CN202311753676.3A 2023-12-20 2023-12-20 Live broadcast equipment and control method thereof Active CN117440184B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311753676.3A CN117440184B (en) 2023-12-20 2023-12-20 Live broadcast equipment and control method thereof

Publications (2)

Publication Number Publication Date
CN117440184A 2024-01-23
CN117440184B 2024-03-26

Family

ID=89555639

Country Status (1)

Country Link
CN (1) CN117440184B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706719A (en) * 2021-08-31 2021-11-26 广州博冠信息科技有限公司 Virtual scene generation method and device, storage medium and electronic equipment
US20220366615A1 (en) * 2019-11-04 2022-11-17 Telefonaktiebolaget Lm Ericsson (Publ) See-through display, method for operating a see-through display and computer program
CN115543756A (en) * 2021-06-29 2022-12-30 腾讯科技(深圳)有限公司 Lamp effect display method, device and storage medium
CN115776750A (en) * 2022-11-30 2023-03-10 深圳市千岩科技有限公司 Lamp effect control method, device, product, medium and lamp effect control equipment
CN115884471A (en) * 2022-12-30 2023-03-31 深圳市千岩科技有限公司 Lamp effect control method and device, equipment, medium and product thereof


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant